Edited transcript of the episode:
Smarinita: Artificial Intelligence or AI, as it is more commonly known, and tech are emerging as potential solutions to many of our climate change worries—whether it’s helping countries and businesses track and manage emissions and become energy efficient or improving our ability to predict extreme weather events. In a city in Japan, for example, a consulting firm has launched digital services that can send disaster alerts to safeguard its residents.
However, AI is not the magic wand that it’s touted to be—it can’t make climate change go away. In fact, there are several downsides to it. Training of algorithms and storage of large AI models, for one, consume extremely large amounts of energy. Then, there is the whole question of who gives and who gets access to the immensely large quantities of data that AI systems use.
While the richer countries with the resources to research, create, and deploy these technologies get to shape the technology and its goals, and leverage its benefits first, the most vulnerable communities may get left behind.
In India, there is very little conversation about what using tech and AI for climate change can actually mean, or about the role that the government, the big tech companies, and ultimately the people need to play.
Talking about some of this and more, today I have with me Jim Fruchterman and Trisha Ray.
Jim is a leading social entrepreneur, a MacArthur Fellow, and the founder and CEO of Tech Matters, a nonprofit technology company based in Silicon Valley. For more than 30 years, Jim was the CEO of Benetech, a nonprofit that develops software applications for social good in the areas of education, employment, and social inclusion. Jim regularly works with various social enterprises on the use of technology and data.
Trisha is a deputy director at Observer Research Foundation’s Centre for Security, Strategy and Technology. Her research focuses on the security implications of emerging technologies, AI governance and norms, and lethal autonomous weapons systems. Prior to this, Trisha worked at the Asia Society Policy Institute in Washington, DC, where she focused on national AI strategies in Asia, nuclear issues, and India–US security relations.
Hi, Trisha. Hi, Jim. Welcome to this podcast.
Smarinita: Jim, coming to you first. Could you talk a little bit about the role that tech and AI have been playing in addressing some of the problems brought on by the climate crisis?
Jim: So right now I would say that AI is being used mainly by the academic and scientific communities to do climate models: to understand that we do have a climate crisis, to keep measuring how things are, and to forecast into the future, which needs something like AI to help you do it. I would say that we're weaker on the local modelling. In other words, local weather monitoring and modelling isn't that good across most of the world; it's great in certain richer countries. So we know what's going on at the world level, but we may not have bothered to do the modelling that tells you what's happening in your district. And much of where I want to see AI being used is to help the local farmer, the local community leader, actually understand better what's going on, what might be going on, and help them make decisions within their own priorities and context. I think AI is not there yet. People are using technology more broadly in a lot of ways to deal with climate, and the biggest area you see is energy, with people trying to innovate around energy. Solar panels are not necessarily considered an advanced technology any more, but they're a big part of this, as is wind power. For example, there was recently a billion-dollar commitment to buy the output of technology that takes carbon out of the atmosphere, something that is not economic today. But certain companies said, "We'll pay you a lot of money per tonne, far more than the market value," to hopefully jumpstart a market for taking carbon out of the atmosphere. So there are a lot of people who are aware that we have a climate crisis going on, and they want to use technology. By and large, the focus is on how to make money on the impending climate gold rush. But there are people who want to get this going for other reasons, especially among philanthropists, and tech companies that want to signal their commitment.
Smarinita: And you’ve also seen examples at a community level of tech being used, whether that’s in your work in Africa or elsewhere?
Jim: Yeah, I would say right now we're part of a community that is trying to bring the benefits of technology to local leaders. It's called the landscape approach, the idea that a landscape is more than an ecosystem: it's got humans in it, and it's got nature, food production, energy, everything else. How do you actually make better things happen at the local level? And local leaders don't think very much about climate. They're thinking about how to increase farmers' incomes when it looks like they're going to have less water than they've ever had before. So they're aware that things are changing, but they want to focus on local issues. A lot of this landscape approach is about making those local leaders, smallholder farmers, and producers more powerful. And it's being led primarily by the traditional aid, biodiversity, and conservation organisations, because they realise they're never going to meet their goals if local communities are making, let's say, climate-dumb decisions instead of climate-smart decisions.
Smarinita: Trisha, in India, there isn’t enough conversation about the actual impact that big tech and AI can have on climate itself. What does it mean for carbon emissions? And who actually gets access to the technology being developed? Could you talk a little bit about what’s happening when it comes to AI and climate mitigation in India?
Trisha: Sure, Smarinita. So as Jim mentioned, there is a lot of good that tech like AI can do, including in helping us track, predict, and mitigate climate change. We could even use AI to invent new synthetic materials to replace the ones we currently use, and materials that are more resistant to heat or materials that can help us store solar energy better. There’s also a lot of interest in a concept called smart grid. And India has its own National Smart Grid Mission, where one could use AI to detect patterns in how people use electricity and how this may change over time and then act accordingly.
But the problem with how we talk about tech for climate is that we think it's a clear-cut solution that will neatly fix all our problems. These applications have a cost, and we can't, and shouldn't, decouple our thinking about the solutions from how people and systems work. So let me explain that a little bit. One way in which we can see this problem is that it takes a lot of energy to train an AI model. There's a study that attempts to quantify the carbon emissions of machine learning, and it says that training a single natural language processing model emits as much carbon dioxide as a car does in its entire lifetime. So that's a lot. Now you're seeing the idea of environmentally sustainable AI gradually coming into the mainstream. As Jim mentioned, there are some big tech companies that are already acting on this. UNESCO last year also adopted a recommendation on the ethics of AI that includes minimising the environmental impact of AI as one of its principles. But we have to remember that these developments are not happening in a vacuum, nor do they affect everybody in the same way. Right now, the social and economic benefits of AI are accruing to a privileged few countries, and a major concern of developing and underdeveloped countries is that you have data flowing out from their citizens and services and products flowing in. So they've become data suppliers and product buyers. The name for this concept is data colonialism. You also have sub-Saharan Africa, Latin America, the Caribbean, and South and Central Asia really falling behind in AI development and use in terms of start-ups, funding, skills, and so on. So there are many aspects to this issue that we need to look at more closely.
Smarinita: That’s an interesting point you’re making Trisha, which is, who produces the data? And who uses it? And we’ve heard this sentence being used often, right, that technology, especially in a country like India, has the ability to democratise and divide.
Jim, you spoke a little bit about why there is a need to get communities to understand and adopt technology so that it doesn’t lead to maladaptation in the long term. But how do we pivot from this seemingly top-down approach when it comes to climate change and tech? How do we move the decision-making power from people with resources, networks, and influence to the communities themselves who are going to be most impacted?
Jim: So I think that the existing dominant tech business model is extractive: we farm you for your data, we become billionaires, and we use your data against your interests. That may be in the form of advertising a product you don't need and convincing you that you need to buy it, or getting you to vote for a politician who isn't actually in favour of your interests. So if we want to see data, and the AI that is built on top of that data, used more in the interest of global society, local communities, and nation states, a lot of what we need to do is recognise the power of data, especially data in large amounts. The data of one farmer is not worth anything, really; the data of 1,00,000 farmers is very interesting, and economically interesting. And so this idea of data colonialism is pervasive. Nithya Ramanathan and I have written several pieces on the need to decolonise data, aimed not at the companies, because they're doing totally cynical data processing, but at governments and nonprofits, to understand that while they're delivering community empowerment, that can't come at the price of extracting data from the community and then using it to punish that community. 'Pay for performance' can end up looking like 'punish you for the outcomes' that we measure in your data.
We don’t have an easy way to share data right now for social benefit.
And so part of this is shifting the power from a unilateral extractive model to one where we understand that there's such a thing as a rightful data owner, and that we're going to use her data not with checkbox consent but with meaningful consent, so that meaningful benefits flow to that community. One of the things that's been identified as a big need is that we don't have an easy way to share data for social benefit right now. Can there be an open-source licence or a Creative Commons licence for private data, so that a farmer knows that if their data is used, it's actually going into a model that's going to help them and their communities more than it's going to help the fertiliser maker, the maker of the agricultural equipment, or the supply chain actor who's going to, by and large, use the data to get a lower price from them? How are we actually going to shift that power? I think that's a big theme of how the social sector should discard the for-profit business models and actually engage in empowering the communities that they serve. And this also includes government; national governments can sometimes be very colonial in the way they treat local government leaders.
Smarinita: Trisha, do you think we need to move towards a more bottom-up approach as well? How can communities, that are being impacted by climate change day in and day out, adopt tech and AI in a manner that’s most suitable for them?
Trisha: Right. So I think the best solutions come from the people who live with the problems. There's an interesting example in relation to environmental crimes in the Amazon. The situation there is a little tricky, because you can't always rely on the government or local authorities; they might often be the ones participating in displacing communities and encroaching on the forest. The Brazilian government also dismantled the organisation in charge of environmental monitoring and protection. So there aren't a lot of official resources you can use. But there's this nonprofit called Rainforest Connection that uses AI to monitor ambient sound and alert local team members if it detects logging or poaching activity, the sound of a chainsaw or something like that. They are building a project in a reserve in the Amazon, partnering with the local indigenous tribes and their rangers to use that AI to monitor and then combat poaching and illegal logging. So that's quite an interesting example that really involves the communities being most affected.
Smarinita: Both of you have spoken about how tech and AI are being used at a community level and the role that civil society can play in facilitating some of this. But what about the big tech companies? While they continue to build larger and larger AI models, which means more emissions and more energy being consumed, what needs to be done to hold them accountable?
Jim: Well, it's a challenge, because tech companies are very powerful and have a habit of being able to kill legislation that works against their interests. And, you know, I'm based in California, which as a state has some of the best privacy legislation in the United States, but it's half as good as what the Europeans have with their privacy regime. We haven't had any big privacy or data use laws passed in major countries, including India, in quite a while. And yet who can offset the power of the tech industry? It's pretty much government; that's the only other power centre that I'm aware of that's going to do anything about this. The tech companies have social good arms. But the idea of the social good arm is: How do we spend 0.1 percent of our profits, or 1 percent of our profits? And then let's not talk about what we do the other 95 percent of the time. Those of us doing social good are 10 or 15 years behind the times and quite tiny in terms of our use of these models, and we're building on the back of these large models. Thank goodness there is an active science community that has an interest in social good; that's another power centre. But by and large they're interested in publishing papers as opposed to helping farmers on the ground. There's a big distance between academia and the average farmer. It's a real challenge. And I'd like to think that the consumer will demand change, but we haven't seen that.
Trisha: So I disagree with Jim a little in that, as consumers, we are aware of how devastating climate change is. In Delhi, we just had a terrible heatwave; Seoul is currently in the middle of the worst flooding they've seen in 80 years. These effects are really tangible. And most consumers do want to make some change in how they consume and how they behave in order to have a positive impact. That, I think, is influencing tech companies as well. Most major technology giants have announced net-zero policies and initiatives; Amazon, Microsoft, Alphabet, and Facebook have all announced some of these initiatives. These are a good sign, but there's still a lot that needs to be done. The first and most fundamental issue, I think, is that these net-zero initiatives often rely on something called the carbon offsets system. Simply put: say I'm Amazon, and I cut down tens of thousands of trees to make cardboard boxes. I can then find a nonprofit that plants x number of trees somewhere in the world and say that my net impact on the environment is zero. That's not how things work in the real world; there is no simple equivalency. What I should be doing as Amazon is fundamentally changing how I operate and being more transparent about the environmental impact of my operations.
Smarinita: Jim, you brought out an interesting policy element to this conversation about the vacuum that exists when it comes to privacy and data laws. Why do you think this is? Is this because there’s a lack of understanding of the impact of some of this within governments?
Jim: By and large, I'm pretty impressed with what our bureaucrats know. I think it's the politics of these issues that stops the progress, not a lack of awareness. It's about where the actual power in society lies. We just barely passed our first climate change bill in the United States, and I mean just barely. So there's this power imbalance, because the tech companies are very rich and are regarded as important to the economies of their countries. The words of tech barons are listened to in the halls of power, even if the bureaucrats are saying, 'But there's this problem.'
You know, I agree with Trisha that consumers have power as it affects politics. I'm more sceptical about their ability to directly change the behaviour of tech companies through their own behaviour, because they still use these big tech companies; these services are free, and they're very seductive in that way. But the more that society is aware of the impact of climate change, and is made aware of its different components, the more pressure people will put on. And I think that's Trisha's point: no one's really thinking about the fact that AI is a big piece of the tech industry's energy use; there's very little awareness of this. People are quite aware of, say, the impact of coal on the climate, and so you're going to hear a lot about how we get out of coal and how we get to more sustainable, regenerative energy sources. But I agree with her that right now AI is not on the radar of major political groups or activists, compared to the things that they have been made aware of over the last 20 years of campaigning in the face of, let's say, confusing information from the fossil fuel industry.
Smarinita: So what will it take to build this kind of awareness? How do we get people to understand the impact of big tech and AI? Trisha, what do you think?
Trisha: A big problem with policymakers who may want to engage with these issues, or who are not even aware of them, is a lack of tech fluency. So it is the duty of industry, civil society, and the media to highlight these issues, and then not just talk at regulators but really engage with them in a way that they understand. One example I think we can all relate to, because it's been in the news quite a lot, is the energy impact of bitcoin mining. We're all now aware of it, and that's because it was everywhere in the media. So there is a role that the media can play in bringing some of these more niche issues on to our radar. We may also want to think about whether we should even be using AI in some situations. There are two fundamental principles of international humanitarian law: necessity and proportionality. So is AI necessary in a given context? Are there alternative solutions that are low tech but perhaps less intrusive? Is the sheen of neutrality of AI solutions distracting from some harmful consequences that they have? We should be deliberating on these solutions rather than implementing them blindly.
Jim: Just to follow up on that. I'm a tech nerd from Silicon Valley, so I'm pretty far from the media. But I think the role of Hollywood and Bollywood in changing consumer opinions should not be discounted. One of my early backers is a guy named Jeff Skoll, who was one of the first two people at eBay. He created a movie company in Hollywood that was going to highlight social issues, and we all joked about how much money he was going to lose trying to do this. But Participant Media has actually really worked. He backed An Inconvenient Truth, which was one of the more influential films, certainly in the United States, for awareness of climate change. So I do think it's not just the traditional news media but also our film industries that are often opinion and sentiment leaders, if they can turn something into a story. It's not easy to turn AI into a story unless it's, you know, the Terminator movies or something like that. So we have to think about this, but I do believe it moves a society.
And on Trisha's second point: I think it's relatively easy for those of us who are working on tech and AI for social good. We're going to use AI to help disabled people; we're going to use AI to help people figure out what's going on in their district. Those are easy to justify, and those, again, represent 0.01 percent of the AI field if we're being very successful. So it's easier to argue for these small-scale uses of AI, and they will be small scale for the foreseeable future, than for large-scale uses of AI that are more comparable to bitcoin, which is hard to defend on a climate basis. 'Its benefits are not in proportion to its climate impact' is a pretty easy statement to make.
Smarinita: Lastly, if you could get tech companies to listen, if you could get governments to listen, and if you could get the media to listen, what would you tell them, whether it’s an incentive or a stick?
Jim: I've been working on AI for social good for more than 30 years. So when I think about what I would like to see changed, I would like to have access to the data and the models that the tech industry has, rather than trying to recreate that data on a tiny crumb of a budget. We often think of our natural resources as belonging to the citizens of a country: we say that our oil wealth should actually benefit our entire community. How do we do that when it comes to tech? How do we ensure that data is used not just for commercial reasons but for public health reasons, for climate change reasons, for educational reasons? That's not the status quo. I would love to see nations say, 'Hey, data is not like oil. If I give you a copy to do something for social good, it doesn't cost you anything. It doesn't take anything away from your advertising business.' And yet we treat data like it's oil or gold and can only belong to one company. I find that very incompatible with society's interests.
Trisha: AI, tech, and climate are all sexy fields. But there's a dearth of funding for some of the more basic academic research on the less explored, less commercially viable aspects of these issues. One example, and this is not related to climate: Bangla as a language is really difficult to digitise and use for NLP applications. Right now the institution doing the most work in digitally preserving a language that's spoken by millions of people is one university in Dhaka, and they're not that well funded either. So that, I think, is a very fundamental problem: funding for these less sexy academic applications.
Smarinita: You know, often when it comes to conversations on AI and technology, and even climate change for that matter, all the technical jargon can be difficult to follow. But both of you have laid out the pros and cons so coherently for me, and hopefully for all our listeners as well.
Jim, you seem to be optimistic about how technology and AI can solve some of our climate change problems, but you’ve caveated that with the fact that the space needs to be regulated. This is especially true if we want to shift the power from those with the money and influence to those actually being impacted by climate change. You on the other hand, Trisha, are clear that technology may further increase the inequity between countries, and within as well. But you’ve also highlighted some of the good that tech and AI can achieve, if used correctly.
So thank you so much, Trisha, and thank you, Jim, for being on this podcast. We hope that some of this sparks larger conversations around both the benefits as well as the concerns of tech and AI on climate. Thank you.