Edge computing will become more critical in the coming years as smart cities deploy millions of new IoT devices. Wireless power will be necessary to keep them running.
For the bulk of human existence, our bodies and brains handled all data processing. We sensed our environment, analyzed the data in real time, and then made decisions to survive. Today, our world is far more complex, and the demands of daily life go beyond survival; we’re now outsourcing those three jobs to devices. But we’ve run into a dilemma: we need more and more of these devices distributed in our environment, and the smarter we make them, the more power they consume.
This power dilemma goes way beyond the smartphone we carry daily. The Internet of Things (IoT) will swell from over 15 billion connected devices today to 200 billion by 2020, says Intel. These devices are everywhere – in factories, retail stores, hospitals, homes, and the gadgets we carry. Most of them run on batteries or expensive wiring. They collect data and transmit it to cloud data centers, where the real brains reside. To make the batteries last, manufacturers design the devices to be frugal power consumers.
By 2022, however, Gartner thinks that 50 percent of enterprise-generated data will be processed outside the cloud, in or near the devices. This “edge computing” offers advantages for security, data costs, and most importantly, response times. Who would ride in a self-driving car that depended on a cell connection to avoid crashing?
We want devices to act autonomously without dependence on internet connections. But we’re trying to provide the additional power they need with wires and batteries. Those power distribution methods cannot scale to 200 billion devices. We need wireless power to overcome their limitations.
By “wireless power,” I mean real wireless power that charges devices from a distance, without line-of-sight, much like Wi-Fi. Wireless power is essential to the growth of IoT and edge computing for several reasons.
#1: Wires are expensive, inconvenient, and limiting
Wires are poorly equipped to power billions of smart devices. Let’s work through some numbers to see why.
The Census Bureau says the average size of a new house in the U.S. is 2,687 square feet. That’s roughly a four-bedroom home. Let’s estimate that it has 80 electric sockets. The national average cost of installing a socket is $203 according to Home Advisor.
Thus, a homeowner pays over $16,000 to wire the home, yet spends an average of $112.59 per month on electricity, according to the Energy Information Administration (EIA). Easily 90 percent of those sockets sit empty. Essentially, homeowners spend roughly 12 years’ worth of electricity just to have unused sockets. They need only a tiny fraction of those sockets to run high-powered appliances such as TVs, fridges, and washers. Installed at home, real wireless power could cover the rest of their devices.
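Here is the back-of-the-envelope math behind those figures, as a quick sanity check (the 80-socket count is the estimate above; the per-socket and monthly-bill figures are the Home Advisor and EIA averages already cited):

```python
# Back-of-the-envelope check of the home-wiring numbers cited above.
sockets_per_home = 80           # article's estimate for a ~2,687 sq ft home
cost_per_socket = 203           # USD, Home Advisor national average
monthly_electric_bill = 112.59  # USD, EIA average residential bill

wiring_cost = sockets_per_home * cost_per_socket
months_of_electricity = wiring_cost / monthly_electric_bill

print(f"Total wiring cost: ${wiring_cost:,}")                 # $16,240
print(f"Equivalent electricity: {months_of_electricity:.0f} months "
      f"(~{months_of_electricity / 12:.0f} years)")           # ~144 months, ~12 years
```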
Consider that some of the most desirable IoT devices – security sensors, thermostats, and door locks, for example – can’t even use existing sockets. It costs roughly $500 to wire one such device (as I’ve learned the hard way).
Now, scale this cost from one home to a ‘smart’ factory, office building, hospital, supermarket, or city. One industrial-grade sensor costs roughly $50, while a wiring job in industrial spaces can escalate to $1,000. If the world has an appetite for 200 billion connected things in two years, powering them with wires would be cost-prohibitive. We might spend as much as 10x or 20x the price of each device just to give it power.
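A rough sketch of that overhead, using the ballpark figures above (real installation costs vary widely by site and device type):

```python
# Ratio of installation (wiring) cost to device cost, using the ballpark
# figures quoted above: a $50 industrial sensor, and wiring jobs that run
# from roughly $500 (residential) to $1,000 (industrial).
device_cost = 50         # USD, typical industrial-grade sensor
wiring_cost_low = 500    # USD, residential hard-wiring job
wiring_cost_high = 1000  # USD, industrial wiring job

print(f"Wiring overhead: {wiring_cost_low / device_cost:.0f}x "
      f"to {wiring_cost_high / device_cost:.0f}x the device cost")  # 10x to 20x
```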
#2: Batteries are even less efficient and limit intelligence
Disposable batteries have become a stopgap for powering technology that can’t depend on wiring, but they’re not a sustainable solution. Annually, Americans purchase nearly 3 billion dry-cell batteries, which contain toxic heavy metals.
Dumping these materials in landfills damages our ecosystem and harms the communities that mine them. Even if we toss aside the ethical arguments against disposable batteries, the economics make little sense.
One kilowatt-hour of electricity from a socket costs about 10 cents. One AA battery delivers roughly one watt-hour and costs about 50 cents. Therefore, energy from the AA costs 5,000 times more than energy from the socket.
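The arithmetic, spelled out with the round numbers above:

```python
# Cost per kilowatt-hour: wall socket vs. disposable AA battery,
# using the round figures from the paragraph above.
socket_cost_per_kwh = 0.10  # USD per kWh from the grid
aa_cost = 0.50              # USD per AA battery
aa_energy_kwh = 1 / 1000    # one watt-hour, expressed in kWh

aa_cost_per_kwh = aa_cost / aa_energy_kwh
print(f"AA energy costs ${aa_cost_per_kwh:,.0f}/kWh, "
      f"{aa_cost_per_kwh / socket_cost_per_kwh:,.0f}x the socket price")  # $500/kWh, 5,000x
```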
Devices that rely on disposable batteries can’t support much intelligence. Consider a battery-powered motion sensor. If you threw a CPU into the sensor, it would guzzle power. That’s why security sensors can’t tell the difference between your dog and robbers. The sensor is designed for battery preservation, not intelligence.
Rechargeable batteries, though better than disposables, still suffer from the same tension between intelligence and power. Unless the device can carry a giant rechargeable battery, it needs to outsource data processing to the cloud. In other words, a Tesla Roadster can support edge computing, but anything that fits in your pocket cannot.
#3: Wireless power can charge the edge
Until recently, cloud infrastructure was sufficient. If Facebook or YouTube suffers latency one day, no one dies. Their intelligence stays in the cloud to spare your phone’s battery. However, the most anticipated inventions of the 21st century can’t afford latency. They must sense, analyze, and act as quickly as a human being, or faster. In many cases, wireless power resolves the conflict between power consumption and intelligence.