Thinking ecologically often goes hand in hand with efficiency. If ecological thinking is about using scarce resources sparingly, being energy-efficient and minimizing waste, then it is generally good for the bottom line as well.
Boomers and Gen-X have known since ‘The Limits to Growth’, the Club of Rome’s report from 1972, that the way we produce and consume is no way to treat a planet.
It’s taken a while and we could dwell on that, but in recent years decision makers across all sectors have started to move the needle on the sustainability front.
Many factors play into the timing: politics and regulations, social pressure, brand image, innovation and new market opportunities, maturity of clean technologies, sincere environmental concerns, cost-efficiency. And, to be honest: Boomers and Gen-X having children (and grandchildren).
One could claim that the internet has been the most profound disruptive technology of this century so far. It has been THE catalyst for technological progress.
Since the dot-com boom, we’ve come to regard startups as the true fountains of innovation – especially internet startups. They have been the ones with radical visions, not constrained by bureaucracy or stuck in old ways of doing things, but instead eagerly adopting modern paradigms like agile software development and subscription business models.
When it comes to the climate crisis, scale matters. A lot. That’s why it’s good news that of late, large industrial players have been setting sail for a cleaner and greener future. They are transforming and re-inventing themselves.
In energy and manufacturing, the topics of the day for reducing CO2 emissions are electrification with renewable energy, innovations in heat and power storage, and retrofitting combustion engines to run on hydrogen. On nuclear and fusion, shall we say, the jury is still out.
As we wrote in a previous post, ‘Up next: natively edge-to-cloud virtual power plants (VPPs)’, oil and gas giant Shell has put itself on course to becoming the largest electricity power company in the world.
In September 2019, Amazon’s CEO Jeff Bezos pledged to make his company net carbon neutral by 2040 and to power it with 100 percent renewable energy by 2030.
Amazon has been investing in solar and wind farms since 2015. Last December, the e-commerce and cloud services giant announced a slate of new projects around the world that would make it the largest corporate purchaser of renewable energy.
Closer to home, in November 2020, ABB announced its Green Electrification 2035 R&D program to develop platforms for optimal power generation and electricity use in order to achieve carbon neutrality, with a 20-million-euro grant from Business Finland, the Finnish government’s innovation agency.
The ABB-led platforms are to combine modern 5G communication technology, data management, new electrical engineering solutions and power grid technology.
Ecological thinking and energy efficiency were also very much at the core of the Helsinki Energy Challenge that we wrote about last week.
In a comment on that article, D.Sc. Seppo Sierla, Lecturer and Docent at Aalto University in Helsinki, wrote:
“(…) the joint optimization of the electric and heating grids sounds like one of the most significant AI problems for the coming years. Relevant data sources include district heating network automation systems, BEMS (Building Energy Management System), VPP (Virtual Power Plant), energy resource specific systems (e.g. heat pump automation), and CHP (Combined Heat and Power) plant automation systems. All of these are proprietary systems with proprietary data models, resulting in great difficulties for accessing the data and preprocessing it for AI applications. Solutions for smart data collection and edge AI data reduction in this environment are quite urgently needed. (…)”
Sierla heads Aalto University’s Predictricity team, which carries out research and development of artificial intelligence solutions with partner companies participating in demand response electricity markets.
Threats and opportunities
From an ecological as well as a cost-benefit perspective, the success of all these ambitious projects depends heavily on how well we manage to utilize data to optimize operations.
As we’ve been saying, a data tsunami is upon us and its impact on our society is transformative. It comes with threats and opportunities.
The amount and variety of data that we will have access to can be overwhelming and even paralyzing. There will be a lack of standardization at first, and confusion as to where one should start to get a grip on it all.
But it also represents a tremendous opportunity to manage everything more ecologically and cost-effectively.
Take camera-assisted object recognition. From IoT to retail to road maintenance, the number of potential use cases is only limited by the imagination. Problem is, video is rather data-heavy. If all that data needs to be sent over a telecom network to a central computer location or cloud, to be processed, analyzed and possibly stored, it quickly becomes prohibitively resource-intensive (= expensive).
To make things worse, the internet was designed primarily for downlink rather than uplink data traffic, which means that, for the time being, this scenario runs into a real technical bottleneck.
The footprint of data transfer
However, if the purpose is to count people in a queue, all we really need is a number – not the pictures. With edge AI we can process the image data immediately and throw more than 99 percent of it away.
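To make the scale of that reduction concrete, here is a minimal sketch in Python. The detection model itself is out of scope; the numbers (720p resolution, the camera ID, the message format) are illustrative assumptions, not part of any real deployment. The point is simply that what leaves the edge device is a count, not the frames:

```python
import json

def frame_size_bytes(width=1280, height=720, channels=3):
    """Raw size of one uncompressed video frame in bytes."""
    return width * height * channels

def edge_payload(person_count, camera_id="queue-cam-01"):
    """What actually leaves the device: a tiny JSON message."""
    return json.dumps({"camera": camera_id, "people": person_count}).encode()

raw = frame_size_bytes()                # one 720p frame is about 2.8 MB raw
payload = edge_payload(person_count=7)  # a few dozen bytes
reduction = 1 - len(payload) / raw
print(f"raw frame: {raw} bytes, payload: {len(payload)} bytes")
print(f"data reduction: {reduction:.4%}")
```

Even before video compression enters the picture, replacing frames with a result message cuts the transmitted data by well over 99 percent, which is exactly the kind of reduction edge AI makes possible.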
Computing and reducing data at the edge also helps prevent the proliferation of data lakes. Under the mantra of Big Data, many organizations have been creating data lakes in hopes of extracting value with AI. Disappointment often sets in as lakes tend to become swamps. Our CEO Henri Kivioja had a few things to say about that in ‘Energy IoT shouldn’t cost an arm and a leg’.
Now, Mats Eriksson, a Lead Business Developer with TietoEvry, recently took the argument for edge computing one ecological step further.
In addition to the technical constraints and budgetary objections to sending data to the cloud for processing, Eriksson rightfully made the point that edge computing can also significantly reduce the CO2 footprint of data transfer.
According to a model he developed, edge computing will help reduce global energy consumption and related greenhouse gas emissions caused by data transport by up to 60 percent.
Nodes on a canvas
Our position is that data should always be processed and reduced as much as possible, as close as possible to where it is generated.
Our solution leverages the edge-to-cloud continuum with a horizontal software platform that can be deployed across virtually any combination of operating systems.
It facilitates unification of proprietary standards at the source with the help of custom nodes. It sports a library of inputs, processes and outputs that users can mix and match to create their unique enterprise data applications, simply by dragging, dropping and connecting nodes on a canvas.
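Conceptually, that kind of node-on-a-canvas pipeline boils down to connecting input, process and output nodes into a flow. The sketch below is a hypothetical illustration of the idea, not the actual platform API; the node names (`sensor-in`, `count`, `mqtt-out`) are made up for the example:

```python
class Node:
    """One pipeline step: wraps a function and knows its downstream nodes."""
    def __init__(self, name, fn):
        self.name, self.fn, self.outputs = name, fn, []

    def connect(self, other):
        """Wire this node's output to another node; returns it for chaining."""
        self.outputs.append(other)
        return other

    def push(self, value):
        """Run this node's function and forward the result downstream."""
        result = self.fn(value)
        for node in self.outputs:
            node.push(result)

results = []
# A tiny "library" of nodes, mixed and matched like on the canvas:
source  = Node("sensor-in", lambda v: v)              # input node
reduce_ = Node("count",     lambda frames: len(frames))  # process node
sink    = Node("mqtt-out",  results.append)           # output node (terminal)

source.connect(reduce_).connect(sink)
source.push(["frame1", "frame2", "frame3"])
print(results)  # [3]
```

A visual editor would generate the `connect` calls from the lines a user draws between nodes; the data reduction happens because only the process node's result, not its input, travels onward.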
Third parties can add their own AI algorithms to the library for everyone to use – or just for themselves. Of course we can help them re-use or create the right AI for their specific needs.
Data utilization and optimization is our bread and butter. We talk with organizations every day about the data they use (and don’t use!) to optimize their operations.
Are you considering how your data could improve your business? In our experience, there is often plenty of low-hanging fruit to be had. Together, surely we can figure out where to start. If you’re interested, have a chat with our CEO. You can book a call with Henri without any obligation.
Maybe engineers will save the planet after all.