65 episodes • Length: 25 min • Monthly
Fixing the Future from IEEE Spectrum magazine is a biweekly look at the cultural, business, and environmental consequences of technological solutions to hard problems like sustainability, climate change, and the ethics and scientific challenges posed by AI. IEEE Spectrum is the flagship magazine of IEEE, the world’s largest professional organization devoted to engineering and the applied sciences.
The podcast Fixing the Future is created by IEEE Spectrum. The podcast and its artwork are embedded on this page using the public podcast feed (RSS).
Gabriel Steinberg, co-founder of the nonprofit Demining Research Community and the startup Safe Pro AI, talks with Spectrum editor Eliza Strickland about using machine learning to speed up demining operations in former Ukrainian battlefields.
Founder and CEO of Exeger, Giovanni Fili, talks with IEEE Spectrum editor Stephen Cass about Exeger's Powerfoyle flexible dye-based solar cells for consumer electronics, which can recharge devices even in indoor light, and how Exeger convinced major companies to incorporate its tech into their products.
The United Kingdom has created a new government agency, the Advanced Research and Invention Agency, or ARIA, similar to the United States' DARPA. ARIA's first foray is into creating new enabling technologies to make AI faster and more energy efficient, and program director Suraj Bramhavar spoke with Spectrum editor Dina Genkina about some of the new avenues ARIA will help investigate.
Zipline originally established itself delivering medical supplies in rural Africa. Now, Zipline cofounder and CTO Keenan Wyrobek talks with senior editor Stephen Cass about recent milestones in bringing commercial drone delivery to the United States, including the development of Platform 2 and its tethered mini-droid that makes precision drop-offs possible in urban areas.
Governments in America and Europe are pushing the deployment of heat pumps to reduce the energy demands of home heating and cooling. Spectrum's power and energy editor Emily Waltz talks with Stephen Cass about her reporting on new advances that will let heat pumps work in colder climates than before, expanding their range considerably.
IEEE Spectrum's semiconductor expert, Samuel K. Moore, talks with Stephen Cass about his visit to one of the key conferences in emerging integrated circuit technology, ISSCC. They talk about Meta's new 3D chip-stacking tech for faster AR, faster AI through in-memory computation, and security technology that can cause a chip to self-destruct if anyone tries to hack it.
In this March roundup, IEEE Spectrum's editor-in-chief Harry Goldstein and senior editor Stephen Cass talk about some of the highlights of Spectrum's recent coverage, including a plea for programmers to stop producing bloated programs, a new transistor that could help make how we handle electrical power smarter, and the potential return of optical discs as a high-density data storage medium.
The Air Force Research Laboratory (AFRL) recently released the open-source ARES_OS, a key software component of its Autonomous Research System. ARES_OS allows relatively simple robots to perform experiments and develop new experiments based on the results. The AFRL's Benji Maruyama talks with IEEE Spectrum associate editor Dina Genkina about how he hopes the system will become not just an invaluable helper for grad students but also open up research to many more people outside traditional labs, enabling progress on hard problems like climate change.
The semiconductor industry is in the midst of a major expansion driven by the seemingly insatiable demands of AI, the addition of more intelligence in transportation, and national security concerns, among many other things. What might this expansion mean for chip-making's carbon footprint? Can we make everything in our world smarter without worsening climate change? Lizzie Boakes is a lifecycle analyst at IMEC, the Belgium-based nanotech research organization, and she speaks with senior editor Samuel K. Moore about her work on this problem.
We've all seen impressive demos of prototype brain implants being used by paralyzed patients to interface with computers, but none of those implants have entered general clinical use. Biomedical device company Synchron is close to actually coming to market with its stentrode technology, promising less spectacular results than some of its competitors, but making up for that with ease of use and implant longevity. Synchron's co-founder Tom Oxley talks with IEEE Spectrum senior editor Eliza Strickland about the new tech, and you can read more in our January issue article by Emily Waltz.
The EU Sustronics program aims to make creating, maintaining, and recycling electronics more sustainable. Liisa Hakola is a senior scientist and project manager at the VTT Technical Research Centre of Finland. She talks with IEEE Spectrum senior editor Stephen Cass about VTT's role in the EU's program, helping manufacturers to develop flexible, printed—and even compostable—electronics.
Security researchers Bruce Schneier and Barath Raghavan believe it's time to stop trusting our data to the cloud, where it can be exposed by greed, accident, or crime. In the December issue of IEEE Spectrum, they proposed a plan for "data decoupling" that would protect our data without sacrificing ease of use, and in this episode Raghavan talks through the highlights of the plan with Spectrum editor Stephen Cass.
Co-CEOs of Silmach, Pierre-Francois Louvigne and Jean-Baptiste Carnet, talk about their new MEMS technology with IEEE Spectrum editor Glenn Zorpette. The tech has been used to create the first major upgrade in decades to the movement of quartz watches: a power-efficient motor that is 50 percent smaller, allows fluid forward-and-back motion of the hand, and requires so little power that a watch can run for over a decade before it needs a new battery. Louvigne also talks about their new hybrid watches, which combine smartwatch electronics with analog faces, and about partnerships with manufacturers such as Timex.
Alan Clark of SUSE talks with IEEE Spectrum editor Stephen Cass about the disruption in the enterprise Linux community caused by recent announcements from Red Hat about open source access to its codebase, and the formation of the Open Enterprise Linux Association (OpenELA) by SUSE, Oracle, and CIQ in response.
Justine Bateman is an author and filmmaker. She also holds a degree in computer science from UCLA and is the AI advisor to SAG-AFTRA, the actors' union currently striking against movie and television studios. In this episode, Bateman talks with IEEE Spectrum senior editor Stephen Cass about actors' demands for control and compensation over digital avatars created in their likeness, and the destructive potential of generative AI in Hollywood.
Wendy H. Wong is a professor of political science at the University of British Columbia and author of the just-released book, We, The Data: Human Rights in the Digital Age. An excerpt from the book regarding the emerging prospect of digitally reanimating the departed is available on IEEE Spectrum's website. In this episode of Fixing The Future, Wong talks with senior editor Eliza Strickland about how the increasing datafication of our lives could make this prospect possible—with or without our consent.
IEEE Spectrum's resident semiconductor expert Samuel K. Moore talks with host Stephen Cass about ASML's enormous machine that's at the heart of chip manufacturing and explains the latest tricks with extreme ultraviolet that will keep Moore's Law going. In addition, new technologies from Edwards and Nvidia should make manufacturing chips greener and faster, respectively.
Reducing our global carbon footprint by switching to electric vehicles means we need a lot more batteries. And that means we need a lot more copper, nickel, cobalt, and lithium to make those batteries. Josh Goldman of KoBold Metals talks to senior editor Eliza Strickland about using AI to decipher geological formations and find new deposits of these minerals, and you can read more in his recent feature for IEEE Spectrum.
IEEE Spectrum's Stephen Cass talks with Arun Gupta, vice president and general manager of Open Ecosystem Initiatives at Intel and chair of the Cloud Native Computing Foundation, about Intel's contributions to open source software projects and efforts to make open source greener and more secure.
Around the world, legislators are grappling with generative AI's potential for both innovation and destruction. Russell Wald is the Director of Policy for Stanford's Institute for Human-Centered Artificial Intelligence. In this episode, he talks with IEEE Spectrum senior editor Eliza Strickland about creating humane regulations that are able to cope with a rapidly evolving technology.
Scott Shapiro is the author of Fancy Bear Goes Phishing: The Dark History of the Information Age in Five Extraordinary Hacks. You can read an excerpt of Fancy Bear at IEEE Spectrum, but in today's episode of Fixing the Future, Shapiro talks with Spectrum editor David Schneider about why cybersecurity can't be fixed with purely technical solutions, why the threat of cyberwarfare tends to be exaggerated, and why cyberespionage will always be with us.
As large language models like GPT-4 and Bard continue to take the world by storm, one of their most high-profile applications is also one of the most unexpected: writing code. AI programming systems like GitHub Copilot are primarily used by software developers as a writing partner, but no-code programming tools can also help non-programmers find new ways to use data. AI watcher Craig Smith talks with Dina Genkina and explains how this programming ability caught researchers by surprise and how anyone can start leveraging these tools.
Sally Adee's new book, We Are Electric: The New Science of Our Body’s Electrome, examines the centuries-long quest to understand how the body uses electricity. Beyond just how neurons send electrical signals, new research is showing how ancient biological mechanisms use electricity to heal our bodies and dictate how cells behave. Adee, a former editor at IEEE Spectrum, talks with host Stephen Cass about this research and how it may even open the door to regenerative technologies that are currently science fiction.
Samuel K. Moore, IEEE Spectrum's senior editor and semiconductor beat reporter, talks about the competing technologies that hope to dramatically speed up computing, especially for machine learning.
Charles Scalfini, the CTO of Panoramic Software, makes the case for why programmers should make the leap to functional programming, which promises more maintainable code, and eliminates some of the problems inherent to conventional languages.
Nick Brown, vice president of product at Truepic, describes how the company's technology, along with standards developed by the Coalition for Content Provenance and Authenticity, is fighting fakes and other forms of image tampering by securing data from the camera lens to the user's screen.
Patients who have traumatic nerve injuries can face significant paralysis, including paraplegia and quadriplegia. Chad Bouton's research is on developing devices that can decode and recode the electrical signals that normally flow between a limb and the brain, allowing damage to be bypassed.
One potential path to tackling climate change due to rising carbon dioxide levels is to lock the carbon dioxide away in geological reservoirs deep underground. Deep learning AI technologies can produce better models of these reservoirs, essential if they are to be used at a big enough scale to make a difference.
Britt S. Young talks with IEEE Spectrum senior editor Stephen Cass about her investigation into high-tech prosthetic hand design: "We are caught in a bionic-hand arms race. But are we making real progress? It’s time to ask who prostheses are really for, and what we hope they will actually accomplish. Each new multigrasping bionic hand tends to be more sophisticated but also more expensive than the last and less likely to be covered (even in part) by insurance. And as recent research concludes, much simpler and far less expensive prosthetic devices can perform many tasks equally well."
SilverLining's executive director Kelly Wanser explains why rising temperatures are behind the push to geoengineer the world's climate, which technologies are most plausible, and why we need a lot more research to find out if it's a good idea, and if so, how to do it on a global scale. Hosted by IEEE Spectrum editor Eliza Strickland.
Hospitals are where we go to get cured of infections and diseases, but sadly, sometimes tragically, and ironically, they are also places we go to get them. According to the Centers for Disease Control, “On any given day, about one in 31 hospital patients has at least one healthcare-associated infection.”
Yet, according to Dr. Lee Harrison, “The current method used by hospitals to find and stop infectious disease transmission among patients is antiquated. These practices haven’t changed significantly in over a century.”
Until perhaps now. Doctors at the University of Pittsburgh and the University of Pittsburgh Medical Center have developed a new method that uses three distinct, relatively new technologies (whole-genome sequencing surveillance, machine learning, and electronic health records) to identify undetected outbreaks and their transmission routes.
Dr. Lee Harrison is a Professor at the University of Pittsburgh, where he’s the Associate Chief of Epidemiology and Education and, more to our point today, the head of its Infectious Diseases Epidemiology Research Unit. He’s the corresponding author of a new paper that describes the new methodology, and he’s my guest today.
Fixing the Future is the weekly podcast of @IEEE Spectrum and is sponsored by @COMSOL
Rare diseases are, well, rare. In two not unrelated ways. By definition, they’re diseases that afflict fewer than 200,000 people. But because, in the world of big business, in particular big pharma, that’s not enough to bother with, that is, it’s not profitable enough to bother with, rare diseases are rarely worked on, to say nothing of cured.
For example, hypertryptophanemia is a rare condition that likely occurs due to abnormalities in the body's ability to process the amino acid tryptophan. How rare? I don’t know. A Google search didn’t yield an answer to that question. In fact, it’s rare enough that Google didn’t even autocomplete the word with 15 of its 19 letters typed in.
Paradoxically, big data has the potential to change that. Because 200,000 is, after all, a lot of data points. But it presents problems of its own. There isn’t one giant pool of 200,000 data points. So the first challenge is to aggregate all the potential data that’s out there. And the big challenge there is that a lot of the data is contained not in beautifully homogeneous, joinable, relatable databases but buried deep in documents like PubMed articles and patent filings.
Deep learning can help researchers pull that data out of those documents. At least, that’s the strategy of a startup called Vyasa. Here to explain it is Vyasa’s CEO and founder, Christopher Bouton.
Like a lot of people, you may be thinking about trading in your car. Me too. The case, morally and even financially, for an all-electric car is becoming stronger and stronger.
And yet, what about recharging?
What’s it like going from, say, Pittsburgh to New York’s Hudson Valley—a trip that doesn’t even have a solid cellular connection? What about a road trip with my partner to Yosemite and back? And even locally, how do you charge up if you live in a townhouse or apartment? Without a driveway and a garage, can you set up charging at home? Will we have a universal standard for charging? What exactly is fast charging?
Basically, if you’re like me, you’re a bundle of questions. Fortunately, a fellow IEEE Spectrum contributing editor is a bundle of answers.
John Voelcker has been reporting on cars and the automotive industry for almost as long as he’s been driving. He’s also a contributing editor to Car and Driver, and is the editor of Green Car Reports. His work has also been featured in Wired, Popular Science, and elsewhere. He’s an actual engineer, with a B.S. in Industrial Engineering from Stanford. And he’s our guest today.
IBM is a remarkable company, known for many things—the tabulating machines that calculated the 1890 U.S. Census, the mainframe computer, legitimizing the personal computer, and developing the software that beat the best in the world at chess and then Jeopardy. The company is, though, even more remarkable for the businesses it departed—often while they were still highly profitable—and for pivoting to new ones before their profitability was obvious or assured.
The pivot that people are most familiar with is the one into the PC market in the 1980s and then out of it in the 2000s. In fact, August 2021 marks the 40th anniversary of the introduction of the IBM PC. Joining me to talk about it—and IBM’s other pivots, past and future—is a person uniquely qualified to do so.
James Cortada is both a Ph.D. historian and a 38-year veteran of IBM. He’s currently a senior research fellow at the University of Minnesota’s Charles Babbage Institute, where he specializes in the history of technology. He was therefore perfectly positioned to be the author of the definitive corporate history of the company he used to work for, in a book entitled IBM: The Rise and Fall and Reinvention of a Global Icon, which was published in 2019 by MIT Press.
There’s no question that computers don’t understand sarcasm—or didn’t, until some researchers at the University of Central Florida started them on a path to learning it.
Software engineers have been working on various flavors of sentiment analysis for quite some time. Back in 2005, I wrote an article in Spectrum about call centers automatically scanning conversations for anger—either by the caller or the service operator—one of the early use-cases behind messages like “This call may be monitored for quality assurance purposes.” Since then, software has been getting better and better at detecting joy, fear, sadness, and confidence, and now, finally, sarcasm.
My guest today, Ramya Akula, is a Ph.D. student and a Graduate Research Assistant at the University of Central Florida's Complex Adaptive Systems Laboratory.
The most honest and inadvertently funny marketing message I ever saw was at a gas station that was closed for remodeling; it had been an Amoco station before that company was bought by BP. The sign said, “Rebranding, to serve you better.”
I’m afraid we’re a bit guilty of that here at Spectrum. This is the 30th episode of IEEE Spectrum’s relaunched podcast series, but the first under a new name, “Fixing the Future.”
We’ve changed the name partly for marketing and searchability reasons. But it also signals our intention to focus more intently on ways that technology is being deployed to improve our lives, specifically in three—to be sure overlapping—areas: climate change; machine learning and other smart technologies; and the effects of automation on the nature of work and the future of jobs.
I’m hard-pressed to imagine a more on-point guest to help me usher in this change than Myriam Sbeiti. She’s the CEO and co-founder of Sunthetics, a startup that’s reinventing the industrial processes by which we make nylon by replacing a thermal operation with an electrical one, and has both grown that business and pivoted toward other industrial processes as well.
Fixing the Future is sponsored by COMSOL, makers of mathematical modeling software and a longtime supporter of IEEE Spectrum as a way to connect and communicate with engineers.
Today’s startup invites us to rethink nuclear energy. Their plan? To put cheap, portable nuclear reactors onto barges and float them out to sea. What could go wrong? According to today’s guest, basically nothing. The reactor design avoids the type of fuel rods that gave us the fictional meltdown in The China Syndrome and the real-life ones in Chernobyl and Fukushima. In fact, my guest will claim his reactor cannot melt down or explode.
One of these reactors would be able to supply electricity, clean water, heating, and cooling to 200,000 households. All with a carbon footprint as low as any other technology—and there are co-generation opportunities that would seem to lower it even further.
The startup is Seaborg Technologies, based in Copenhagen, and we’re lucky to have its co-founder and CEO, Troels Schönefeldt, with us today to explain how this isn’t all too good to be true.
A few months ago, we had on the show an economist who specialized in the energy sector. She noted that while the Trump administration had put drilling rights in the Arctic National Wildlife Refuge, or ANWR, on the block, there wasn’t much interest from the oil industry, and, more generally, that the Arctic and other cold climes presented logistical—and therefore financial—problems for oil companies.
To be sure, oil companies have been drilling in the frigid North Sea for decades, but that doesn’t mean it’s been easy. For example, at BP’s Valhall oil field in the Norwegian sector of the North Sea, drilling began in 1982, and the company is still pulling 8000 barrels per day, but losses are considerable—or have been until BP began working with a data science company. Yes, a data science company.
Further out, in the middle of the North Sea, another set of BP oil fields, known as Alvheim, has been rediscovered to have greater reserves than previously thought. There, the same data science company optimized a calibration process and in so doing reduced production losses and saved BP considerable money.
The data science company’s work isn’t limited to oil and gas. For example, it recently won a research contract with the California Energy Commission to use modeling and data analytics to help it improve production efficiencies in wind energy.
The data science company is called Cognite, and my guest today is its Senior Director in charge of Energy Industry Transformation, Carolina Torres.
When horses were replaced by engines, for work and transportation, we didn’t need to rethink our legal frameworks. So when a fixed-in-place factory machine is replaced by a free-standing AI robot, or when a human truck driver is replaced by autonomous driving software, do we really need to make any fundamental changes to the law?
My guest today thinks that, surprisingly, we do not; we need to change the laws far less than we might think. In case after case, he says, we just need to treat the robot more or less the same way we treat a person.
A year ago, he was giving presentations in which he argued that AIs can be patent holders. Since then, his views have advanced even further. And so last summer, Cambridge University Press published a short but powerful treatise, The Reasonable Robot: Artificial Intelligence and the Law. In it, he argues that the law more often than not should not discriminate between AI and human behavior.
Ryan Abbott is a Professor of Law and Health Sciences at the University of Surrey and an Adjunct Assistant Professor of Medicine at the David Geffen School of Medicine at UCLA. He’s a licensed physician, and an attorney, and an acupuncturist in the United States, as well as a solicitor in England and Wales. His M.D. is from UC San Diego’s School of Medicine; his J.D. is from Yale Law School and his M.T.O.M.—Master of Traditional Oriental Medicine—degree is from Emperor's College.
As we begin to finally address climate change in a serious way, we need to look at our cities in a serious way. And not just first-tier cities like, well, New York, San Francisco, Seattle, and Los Angeles, and not just flashy growing cities like Las Vegas, Austin, Atlanta, and Columbus. We need to look at cities like Baltimore, Cleveland, Detroit, Philadelphia, Pittsburgh, St. Louis—cities that haven’t come back from the problems—deindustrialization, disinvestment, white flight—of 50 and 60 years ago.
These cities are at a crossroads, according to my guest today. They can, he says, enjoy a comeback, stagnate, or continue to decline. There is, in fact, a unique opportunity presented by the pandemic: as working remotely becomes more widely accepted, there could be a migration to cities such as these by people not ready to give up on city life, but looking for greater affordability.
Matthew Kahn is a Distinguished Professor of Economics and Business at Johns Hopkins University; he’s the Business Director of its 21st Century Cities Initiative; and he’s co-author of a new book that addresses these questions about these very cities, titled Unlocking the Potential of Post-Industrial Cities.
I suppose it’s elitist and maybe even nationalistic of me but I was surprised to hear the phrase “resource curse,” which I associate with the developing world, used recently in a webinar in the context of a region of the United States. The region is northern Appalachia, comprising 22 counties in eastern Ohio, western Pennsylvania, and northern West Virginia. And the curse is, as it so often is in the third world, a surfeit of oil and especially natural gas, in this case extractable largely through the relatively new process of fracking.
Here to explain how the resource curse is impoverishing communities in the middle of the U.S. in the middle of the 21st century is Sean O’Leary. He’s a senior researcher at the Ohio River Valley Institute and the author of its recent report, “Appalachia’s Natural Gas Counties: How dreams of jobs and prosperity turned into almost nothing.”
In the world of prosthetics, we’re still at the stage where a person has to instruct the prosthetic to first do one thing, then another, then another. As University of Waterloo Ph.D. researcher Brokoslaw Laschowski puts it, “Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode.”
But Laschowski and his fellow researchers have been developing a device that uses wearable cameras and deep learning to figure out the task that the exoskeleton-wearing person is engaged in, perhaps walking down a flight of stairs, or along a street, and gets them there, a bit like programming a destination in a self-driving car.
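As a rough illustration of the kind of pipeline involved, here is a minimal sketch of classifying wearable-camera frames into walking environments. The model choice, label set, and function names are my own assumptions for illustration, not Laschowski's actual system, which is trained on large labeled datasets of walking video.

```python
# Sketch of environment recognition for an exoskeleton controller.
# Model, labels, and camera hookup are invented for illustration.
import torch
import torch.nn as nn
from torchvision import models, transforms

LABELS = ["level-ground", "stairs-up", "stairs-down"]  # assumed label set

# Small CNN backbone, its classification head resized to our classes.
net = models.mobilenet_v2(weights=None)
net.classifier[1] = nn.Linear(net.last_channel, len(LABELS))
net.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def classify_frame(frame_pil):
    """Map one wearable-camera frame (a PIL image) to an environment."""
    x = preprocess(frame_pil).unsqueeze(0)   # add batch dimension
    with torch.no_grad():
        probs = net(x).softmax(dim=1)[0]
    return LABELS[int(probs.argmax())]

# In a controller, this prediction would switch the exoskeleton's
# locomotion mode automatically, replacing the manual smartphone step.
```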
A @RadioSpectrum1 conversation. Available on Spotify and @IEEESpectrum.
Has there been any technology more widely talked about and yet still less understood than 5G? Our guest, Qualcomm’s vice president of engineering John Smee, holds dozens of patents in wireless technologies; his designs and innovations range from CDMA and LTE to Wi-Fi and now 5G. He’ll explain the challenges of 5G—and what 6G will be like. A full transcript of this and all Radio Spectrum conversations is available at https://spectrum.ieee.org/multimedia/podcasts.
If there’s one thing we can all agree on, it’s that the world is not only changing quickly, it’s changing at a faster rate than ever. Or does it just seem that way?
Surely we can all agree that the Industrial Revolution has changed everything. Or has it? One noted economist says there were in fact three industrial revolutions, and only one of them (the second, from about 1870 to 1914) was important. In fact, he largely discounts what we call the information revolution as insubstantial.
If you wanted to study the great trends and transitions of civilization—not just Western Civilization, but all of it—and break it down into epochs, and choose from the various transitions the five or seven most significant ones, and study the interplays of these transitions—which are causes of the others, and to what degree, and why some occur quickly and others—like the electric car—are postponed for a hundred years; if you wanted to do all that, it would take a lifetime of study.
In fact, you’d have to write ten or thirty books, each one of which looks at some aspect of our world from a height of 30,000 feet, and then write an eleventh or thirty-first book that was the encapsulation of all that wisdom.
That certainly seems impossible. The last true Renaissance person, someone who knew pretty much all that was known at the time, might have been Aristotle, with asterisks for Franklin and Diderot, and maybe Bertrand Russell.
And yet, my guest today—who doesn’t know all that is currently known, but knows quite a bit about almost everything about technology, the social and cultural changes that technologies have wrought, and what causes technological change itself—has done just that.
Václav Smil is a Czech-born Distinguished Professor Emeritus in the Faculty of Environment at the University of Manitoba, a part of the world we don’t always associate with the Renaissance. He’s the author of more than 40 books in an enormously wide range of fields that includes energy and food production, environmental and population change, risk and public policy, and the history of technology and innovation. He’s also a contributing editor at IEEE Spectrum.
His new book, which in some sense encapsulates all his prior scholarship, is Grand Transitions: How the Modern World Was Made, published March 1st by Oxford University Press.
At a conference of chief technology officers in 2016, General Michael Hayden, former head of, at different times, both the NSA and the CIA, told the audience, “Cyberwar isn’t exactly war, but it’s not not-war, either.”
Cyberattacks, at the nation-state level, were already almost a decade old at that point. In 2007, over the course of 22 days, a Russian attack on Estonia took out commercial and government servers, online banking, and the Domain Name System, without which people can’t find or look up websites and online servers. The attack carried into the cyber realm an already heated political conflict between the two nations, and Estonia’s economy was as much under attack as its information infrastructure.
In 2010, we learned of the U.S.–Israeli attack on Iran and its uranium centrifuges, known as Stuxnet.
In 2015, a concerted attack, believed to have been Russian, on the power grid of another east European nation, Ukraine, left more than 200,000 people without electricity for at least several hours. It was the first attack on a grid, and perhaps the first large-scale SCADA attack—that is, on the control systems of critical infrastructure. Follow-up attacks struck the railway, television, and mining sectors.
In 2016, right around the time General Hayden was warning American audiences of the dangers of cyberwar, Russia, in conjunction with a private firm, Cambridge Analytica, and elements of the U.S. Republican party, crafted a disinformation campaign to influence the presidential election that year. Russia and Cambridge Analytica also undermined the Brexit referendum in the U.K. earlier that year.
Since then, we’ve seen entire families of malware appear, such as Trickbot. Arguably even worse was the recent SolarWinds hack, which in effect was an attack on what we might call the software supply chain. As many as 18,000 different organizations using SolarWinds may have been affected. Worse, the effects of the hack may have reached out into other networks and therefore been exponential. For example, both Microsoft and security firm FireEye were affected, and they each have many enterprise customers.
As the Roman poet Juvenal asked, Quis custodiet ipsos custodes? Who shall guard the guardians themselves?
A @RadioSpectrum1 conversation with Justin Cappos who heads the Secure Systems Laboratory at @NYU. On @Spotify and @IEEESpectrum https://spectrum.ieee.org/multimedia/podcasts
In the 2020 elections for the North Carolina State House, Democrats received 49 percent of the votes but won only 42.5 percent of the seats. In three-quarters of the state-level elections, the winning margin was more than 20 percentage points—in other words, landslides—even though statewide, the margin between the two main political parties is razor-thin—at the presidential level, Trump beat Biden by less than 2 percent, and a Democrat won the 2020 governor’s race. That’s gerrymandering, the process by which a state is divided up in such a way as to maximize the number of electoral seats one particular party is likely to win.
There are two ways to gerrymander. In one, known as packing, you concentrate your opposition’s likely voters into a single district, giving that one away but winning all or most of the surrounding areas. In the other, known as cracking, you divide a concentration of likely voters into two or more districts in such a way that they’ll fall short of a majority in each.
Gerrymandering is obviously unfair, but creating fair districts is harder than it looks. So political operatives and consultants draw up various maps, maximizing this or that, but mostly their party’s interests.
If this seems instead like a job for computer-aided statistical analysis, it is. Several years ago, researchers in North Carolina got the idea of generating thousands—even tens of thousands—of maps, and creating algorithms that maximize the desired variables to the extent possible.
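To make the ensemble idea concrete, here is a minimal sketch using invented toy data, not the Duke group's actual code or precinct maps (which are sampled with Markov chain methods under legal constraints): generate many neutral plans, count the seats a party would win under each, and ask whether an enacted plan's seat count is an outlier.

```python
# Minimal sketch of ensemble outlier analysis for gerrymandering.
# All data here is invented toy data for illustration.
import random
from collections import Counter

random.seed(42)

# Toy state: 120 precincts with a near-tied statewide vote, 12 districts.
precincts = [random.gauss(0.50, 0.12) for _ in range(120)]  # Dem vote share
NUM_DISTRICTS = 12
SIZE = len(precincts) // NUM_DISTRICTS

def random_plan():
    """Assign precincts to equal-sized districts at random
    (ignoring contiguity and compactness for simplicity)."""
    order = list(range(len(precincts)))
    random.shuffle(order)
    plan = [0] * len(precincts)
    for rank, p in enumerate(order):
        plan[p] = rank // SIZE
    return plan

def seats_won(plan):
    """Count districts where the average Democratic share exceeds 50%."""
    totals = [0.0] * NUM_DISTRICTS
    for p, d in enumerate(plan):
        totals[d] += precincts[p]
    return sum(1 for t in totals if t / SIZE > 0.5)

# Build an ensemble of thousands of neutral plans.
ensemble = Counter(seats_won(random_plan()) for _ in range(10_000))
print(sorted(ensemble.items()))
# An enacted map whose seat count sits far out in the tail of this
# distribution is statistical evidence of gerrymandering.
```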
Jonathan Mattingly is a Professor of Statistical Science and a Professor of Mathematics at Duke University. He leads a group at Duke that conducts non-partisan research to understand and quantify gerrymandering.
A @RadioSpectrum1 conversation with Duke University Professor of Mathematics Jonathan Mattingly. Available on Spotify, Apple, and @IEEESpectrum.
Let’s face it. The United States, and, really, the entire world, has squandered much of the time that has elapsed since climate change first became a concern more than forty years ago.
Increasingly, scientists are warning that taking coal plants offline, building wind and solar farms here and there, and planting trees, even everywhere, aren’t going to keep our planet from heating to the point of human misery. Twenty years from now, we’re going to wish we had started thinking about not just carbon-zero technologies, but carbon-negative ones.
Last year we spoke with the founder of Air Company, which makes carbon-negative vodka by starting with liquid CO2 and turning it into ethanol, and then further refining it into a product sold in high-end liquor stores. Was it possible to skip the final refining steps and just use the ethanol as fuel? Yes, we were told, but that would be a waste of what was already close to being a premium product.
Which leads to the question, are there any efforts underway to take carbon out of the atmosphere on an industrial scale? And if so, what would be the entire product chain?
One company already doing that is Global Thermostat, and its CEO is our guest today.
Many things have changed in 2020, and it’s an open question which are altered permanently and which are transitory. Work-from-home may be here to stay, as might the shift from movie theaters and cable TV networks to streaming services; pet adoption rates are so high that some animal shelters are empty; and global greenhouse gas emissions declined by record amounts.
That last fact has several causes—the lockdowns and voluntary confinements of the pandemic; an oil glut that preceded the pandemic and continued through it; the ways renewable energy—especially solar energy—is successfully competing with fossil fuels. According to the Institute for Energy Economics and Financial Analysis, an Ohio-based non-profit that studies the energy economy, more than 100 banks and insurers have divested or are divesting from coal mining and coal power plants. Their analysis also shows that natural gas power plant projects—for example, one that’s been proposed for central Virginia—are a poor investment, due to a combination of clean-energy regulations and the difficulty of amortizing big power-plant construction in the face of a clean-energy pipeline expected to grow dramatically over the next four years.
Such continued growth in clean-energy projects is particularly notable, as it comes despite high job losses for the renewable energy industry, slowing construction activity, and difficulty in finding capital financing. Those same headwinds brought about a record number of bankruptcies in the fracking industry.
Our guest today is eminently qualified to answer the question, are the changes we’re seeing in the U.S. energy-generation profile temporary or permanent? And what are the consequences for climate change? Kathy Hipple was formerly an analyst at the aforementioned Institute for Energy Economics and Financial Analysis and is a professor in Bard College’s Managing for Sustainability MBA program.
Batteries have come a long way. What used to power flashlights and toys, Timex watches and Sony Walkmans, is now found in everything from phones and laptops to cars and planes.
Batteries all work the same: Chemical energy is converted to electrical energy by creating a flow of electrons from one material to another; that flow generates an electrical current.
Yet batteries are also wildly different, both because the light bulb in a flashlight and the engine in a Tesla have different needs, and because battery technology keeps improving as researchers fiddle with every part of the system: the two chemistries that make up the anode and the cathode, and the electrolyte and how the ions pass through it from one to the other.
A Chinese proverb says, “Give a man a fish, and you feed him for a day. Teach a man to fish, and you feed him for a lifetime.” The Christian Bible says, “Follow me and I will make you fishers of men.”
In other words, a more engineering-oriented proverb would say, “let’s create a lab and develop techniques for measuring the efficacy of different fishing rods, which will help us develop different rods for different bodies of water and different species of fish.”
The Argonne National Laboratory is one such lab. There, under the leadership of Venkat Srinivasan, director of its Collaborative Center for Energy Storage Science, a team of scientists has developed a quiver of techniques for precisely measuring the velocity and behavior of ions and comparing them to mathematical models of battery designs.
Venkat Srinivasan is also deputy director of Argonne’s Joint Center for Energy Storage Research, a national program that looks beyond the current generation of lithium–ion batteries. He was previously a staff scientist at Lawrence Berkeley National Laboratory, wrote a popular blog, “This Week in Batteries,” and is my guest today.
A @RadioSpectrum1 conversation with Venkat Srinivasan of @Argonne Available on Spotify and @IEEESpectrum https://spectrum.ieee.org/multimedia/podcasts
The saddest fact about the coronavirus pandemic is certainly the deaths it has already caused and the many more deaths to come before the world gets the virus under at least as much control as, say, chickenpox.
The second-saddest fact about the pandemic is the economic and educational havoc it has wrought.
Perhaps the third-saddest fact is the unfortunate lack of agreement about the best strategies for living with the virus, which, at least in the U.S., is responsible for many of those deaths and, arguably, much of the havoc as well. It has roiled families as well as the presidential election by politicizing the wearing of masks, the limits on gatherings, and the openings and closings of restaurants and schools.
But yet another sad fact is that, as was said thousands of years ago, “there is nothing new under the sun,” and this too is nothing new; there is a shocking and unfortunate lack of widespread agreement about the best answers when it comes to many medical questions, even among doctors, because there is a shocking and unfortunate lack of evidence—and even respect for evidence—in the medical arena. That’s the contention of the authors of a rather prescient 2017 book, Unhealthy Politics: The Battle Over Evidence-Based Medicine, which describes how partisanship, polarization, and medical authority stand in the way of evidence-based medicine.
The Federal Communications Commission's very first cellular spectrum allocation was a messy affair. The U.S. was divided up into 120 cellular markets, with two licenses each, and in some cases, hundreds of bidders. By 1984, the FCC had switched over to a lottery system. Unsurprisingly, people gamed the system. The barriers to entering the lottery were low, and many of the 37,000 applicants—yes, 37,000 applicants—simply wanted to flip the spectrum for a profit if they won.
The FCC would soon move to an auction system. Overnight, the barrier to entry went from very low to very high. One observer noted that these auctions were not “for the weak of heart or those with shallow pockets.”
Cellular adoption grew at a pace no one could anticipate. In 1990 there were 12 million mobile subscriptions worldwide and no data services. Twenty-five years later, there were more than 7 billion subscriber accounts sending and receiving about 50 exabytes per day and accounting for something like four percent of global GDP.
Historically, cellular has occupied a chunk of the radio spectrum that had television transmissions on the one side and satellite use on the other. It should come as no surprise that to meet all that demand, our cellular systems have been muscling out their neighbors for some time.
The FCC is on the verge of yet another auction, to start on December 8. Some observers think this will be the last great auction, for at least a while. It’s for the lower portion of what’s called the C-band, which stretches from 3.7–4.2 gigahertz.
To sort out the who, what, when, why, and a bit of the how of this auction our guest today is Mark Gibson, Senior Director for Business Development and Spectrum Policy at CommScope, a North-Carolina-based manufacturer of cellular and other telecommunications equipment.
In 1936, after polling its readers, the Literary Digest famously predicted a landslide victory for Alf Landon. On 2 November 1948, based on widespread polling that all pointed in one direction, the Chicago Tribune famously headlined its early edition, “Dewey Defeats Truman.”
Polls have been making mistakes ever since, and it’s always, fundamentally, the same mistake. They’re based on a representative sample of the electorate that isn’t sufficiently representative.
After the election of 2016, in which the polling was not only wrong but itself might have inspired decisions that affected the outcome—where the Clinton campaign shepherded its resources; whether James Comey would hold a press conference—pollsters looked inward, re-weighted various variables, assured us that the errors of 2016 had been identified and addressed, and then proceeded to systematically mis-predict the 2020 presidential election much as they had four years earlier.
After a century of often-wrong results, it would be reasonable to conclude that polling is just too difficult for humans to get right.
But what about software? Amazon, Netflix, and Google do a remarkable job of predicting consumer sentiment, preferences, and behavior. Could artificial intelligence predict voter sentiment, preferences, and behavior?
Well, it’s not as if they haven’t tried. And results in 2020 were mixed. One system predicted Biden’s lead in the popular vote to be large, but his electoral college margin small—not quite the actual outcome. Another system was even further from the mark, giving Biden wins in Florida, Texas, and Ohio—adding up to a wildly off-base electoral college margin.
One system, though, did remarkably well. As a headline in Fortune magazine put it the morning of election day, “The polls are wrong. The U.S. presidential race is a near dead heat, this AI ‘sentiment analysis’ tool says.” The AI tool predicted the popular vote almost perfectly.
That AI company is called Expert.ai, and its Chief Technology Officer, Marco Varone, is our guest today.
If any cars are mobile phones with wheels, it’s electric cars. And just as the switch from landline phones to mobile phones was quick, and from computers to smartphones was even quicker, the shift from engines to motors, from internal combustion cars to electric cars, is starting to gain momentum, and when it reaches scale, it will happen quickly.
How quickly? Pandemic aside, Tesla would be on track to sell half a million cars in 2020, all of them electric. By contrast, GM sold almost 3 million cars last year, almost none of them electric. But GM plans to sell a million electric cars a year by 2025 or so, the year the company expects to be its tipping point toward electrics.
Of course, to do that, you need amazing batteries, and an amazing capacity to produce batteries—both of which are at the heart of the company’s plans. A new GM battery factory, in partnership with LG Chem, will dwarf Tesla’s Gigafactory and power, pun intended, its drive, pun again intended, to that 2025 goal of a million electric cars.
A modern hospital operating room often has someone you never see on television: a medical device company representative. The device might be a special saw or probe or other tool for the surgeon to use. It might be an artificial hip or knee—or a mandible or a pacemaker.
The surgeon may be using the device and its toolkit for the first time. The medical device company representative often knows more about the device and its insertion than anyone on the surgical team.
Even in non-Covid times, it’s a plus to have as few people in the OR as possible. And it’s inefficient to fly these company reps around the country to advise an operation that might only take an hour.
And so, in a handful of ORs, you’ll see something else—one or more cameras, mounted strategically, and a flat-panel screen on a console, connected to a remote console. The medical device rep—or a consulting surgeon—can be a thousand kilometers away, controlling the cameras, looking at an MRI scan, and making notations on their tablet that can be seen on the one in the operating room.
It’s telemedicine for the OR, and it’s the brainchild of our guest today.
November is a big month for the millions of people who devote their time and money to computer games. Within a two-day period, Sony will be releasing its fifth-generation PlayStation, and its main competitor, Microsoft’s newest Xbox, comes out as well. So it’s a good month to look at the culture of gaming and how it reflects the broader culture; how it reinforces it; and how it could potentially be a force for freeing us from some of the worse angels of our nature—or for trapping us further into them. Is there anyone better to ask than Megan Condis? She’s a professor at Texas Tech University and the author of the 2018 book, Gaming Masculinity: Trolls, Fake Geeks, and the Gendered Battle for Online Culture.
Engineers will tell you that for an orchestra to rehearse remotely, it would need at least 500 megabits per second to avoid throwing off the synchronicity of a concert performance. But that’s using high bandwidth as a proxy for latency. Why not work on reducing latency directly? Because it’s hard. Nonetheless, at the cutting edge of network engineering, researchers are working on it. And not for orchestras. Autonomous vehicles, factory robots, virtual reality, piloting drones, robotic surgery, and even advanced prosthetics all require latency to get down toward one millisecond. So says our guest today, Shivendra Panwar, author of an article in November’s IEEE Spectrum magazine, “Breaking the Latency Barrier.”
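A quick back-of-the-envelope calculation (my own illustration, not a figure from Panwar's article) shows why a one-millisecond budget is so punishing: signal travel time alone rules out distant servers.

```python
# Back-of-the-envelope arithmetic (illustrative, not from the article):
# signals in optical fiber travel at roughly 2e8 m/s, about two-thirds
# the speed of light in vacuum. That alone caps how far away a server
# can sit if the round trip must finish in 1 millisecond.
C_FIBER = 2.0e8   # approximate signal speed in fiber, m/s
BUDGET = 1e-3     # 1 ms round-trip latency budget, seconds

# A round trip covers the path twice, so halve the total distance.
max_one_way_m = C_FIBER * BUDGET / 2
print(f"Max one-way fiber distance: {max_one_way_m / 1000:.0f} km")
# -> 100 km, before counting routers, radio links, or processing time,
# which is why ultra-low-latency services push compute to the network edge.
```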
In Get Out the Vote, co-authors Donald Green and Alan Gerber argue that political consultants and campaign managers have underappreciated boots-on-the-ground canvassing in person and on the phone, in favor of less personal, more easily scaled methods—radio and TV advertising, robocalling, mass mailings, and the like. Unlike campaign professionals, they base their case on real data from experimental research. Donald Green is a political scientist at Columbia University focusing on such issues as voting behavior and partisanship, and most importantly, methodologies for studying politics and elections.
In 2014, two Google engineers, writing in the pages of IEEE Spectrum, noted that “if all power plants and industrial facilities switch over to zero-carbon energy sources right now, we’ll still be left with a ruinous amount of CO2 in the atmosphere.”
One alternative is to stuff carbon dioxide underground. People have been talking about this for well over a decade. But just look at ExxonMobil’s website and see how much progress hasn’t been made. In 2015, a bunch of mostly Canadian energy producers decided on a different route. They funded what came to be called the Carbon XPRIZE to turn “CO2 molecules into products with higher added value.”
One of the more unlikely finalists emerged from the hipsterish Bushwick neighborhood of Brooklyn, N.Y. Their solution to climate change: vodka. The startup, Air Company, takes liquefied CO2 and distills it into ethanol, and then fine-tunes it into vodka. The resulting product is not only carbon-neutral but carbon-negative.
In 2011, the former executive director of MoveOn gave a widely viewed TED talk, “Beware Online Filter Bubbles,” that became a 2012 book and a startup. In all the talk of fake news these days, many of us have forgotten the unseen power of filter bubbles in determining the ways in which we think about politics, culture, and society. That startup tried to get people to read news stories they might otherwise not see by repackaging them with new headlines.
A recent app, called Ground News, has a different approach. It lets you look up a topic and see how it’s covered by media outlets with identifiably left-leaning or right-leaning slants. You can read the coverage itself, right–left–moderate or internationally; look at its distribution; or track a story’s coverage over time. Most fundamentally, it’s a way of seeing stories that you wouldn’t ordinarily come across.
Our guest today is Sukh Singh, the chief technology officer of Ground News and one of its co-founders.
Despite what you think, fake news is a tiny fraction of our news diet, according to Jennifer Allen, a Ph.D. student at the MIT Sloan School of Management and the MIT Initiative on the Digital Economy, and lead author of the research study, “Evaluating the Fake News Problem at the Scale of the Information Ecosystem."
Marchetti’s Constant, named after Italian physicist Cesare Marchetti, is the average time people spend on their daily commute, which is approximately a half-hour each way, all around the world. The average U.S. commute is about 27 minutes, up 8 percent from a decade earlier. But that averages people who walk 10 minutes to work with people who drive an hour; it averages people who have a quick subway ride and people taking two or three buses that run only infrequently.
But in the mind of another Italian physicist who has turned his attention and his career to transportation, we now have enough computing power—smartphones, AI, and the cloud—for a different kind of solution.
Tommaso Gecchelin is a physicist and industrial designer. After studying quantum mechanics in Padua, Italy, and industrial design in Venice, he co-founded something called NEXT Future Transportation. For the past seven years he has been developing a system of bus pods, one that in effect chops up a bus into car-sized pieces and has the potential to combine the best of commuter buses with the best of Uber.
The coronavirus pandemic has exposed any number of weaknesses in our technologies, business models, medical systems, media, and more. Perhaps none is more exposed than what my guest today calls, “The Hidden World of Legacy IT.”
If you remember last April’s infamous call for volunteer COBOL programmers by the governor of New Jersey, when his state’s unemployment and disability benefits systems needed to be updated, that turned out to be just the tip of a ubiquitous multi-trillion-dollar iceberg—yes, trillion with ‘t’—of outdated systems. Some of them are even more important to us than getting out unemployment checks—though that’s pretty important in its own right. Water treatment plants, telephone exchanges, power grids, and air traffic control are just a few of the systems controlled by antiquated code.
In 2005, Bob Charette wrote a seminal article entitled “Why Software Fails.” Now, fifteen years later, he strikes a similar nerve with another cover story that shines a light on the vast and largely hidden problem of legacy IT.
Electricity is the key to modern life as we know it, and yet, universal, reliable service remains an unsolved problem. By one estimate, a billion people still do without it. Even in a modern city like Mumbai, generators are commonplace, because of an uncertain electrical grid. This year, California once again saw rolling blackouts, and with our contemporary climate producing heat waves that can stretch from the Pacific Coast to the Rocky Mountains, they won’t be the last.
Electricity is hard to store and hard to move, and electrical grids are complex, creaky, and expensive to change. In the early 2010s, Europe began merging its distinct grids into a continent-wide supergrid, an algorithm-based project that IEEE Spectrum wrote about in 2014. The need for a continent-wide supergrid in the U.S. has been almost as great, and by 2018 the planning of one was pretty far along—until it hit a roadblock that, two years later, still stymies any progress. The problem is not the technology, and not even the cost. The problem is political. That’s the conclusion of an extensively reported investigation jointly conducted by The Atlantic magazine and InvestigateWest, a watchdog nonprofit that was founded in 2009 after one of Seattle’s daily newspapers stopped publishing. The resulting article, with the heading “Who Killed the Supergrid?”, was written by Peter Fairley, who has been a longtime contributing editor for IEEE Spectrum.
We’re used to the idea of gold and silver being used as money, but in the 1600s, Sweden didn’t have a lot of gold and silver—not enough to sustain its economy. The Swedes had a lot of copper, though, so that’s what they used for their money. Copper isn’t really great for the job—it’s not nearly scarce enough—so Swedish coins were big: the largest denomination weighed forty-three pounds, and people carried them to market on their backs. So the Swedes created a bank that gave people paper money in exchange for giant copper coins.
The Swedes weren’t the first to create paper money—they missed that mark by several hundred years. Nor will they likely be the first to get rid of paper money, though they may have the lead in that race. A few years ago, banks there started refusing to accept cash deposits or to allow cash withdrawals, until a law was passed requiring them to do so.
A new book about the history and future of money has just come out, imaginatively titled, Money. It’s not specifically about Sweden—in fact, those are the only two times Sweden comes up. It’s about money itself, and how it has changed wildly across time and geography—from Greek city-states in 600 B.C. to China in the eighth century and Kublai Khan in the thirteenth, to Amsterdam in the seventeenth, Paris during the Enlightenment, and the U.S. in the nineteenth century and cyberspace in the twenty-first.
It’s a wild ride that the world is still in the middle of, and it’s told in a thoroughly researched but thoroughly entertaining, and I mean laugh-out-loud entertaining, literally—I had to finish the book last night downstairs on the couch—told as a series of stories by one of radio’s great storytellers. Jacob Goldstein was a newspaper reporter before joining National Public Radio’s popular show, Planet Money, which he currently co-hosts, and he’s the author of the destined-to-be popular tome, Money, newly minted by Hachette Books.
You’re surely familiar—though you may not know it by name—with the Paradox of Choice; we’re surrounded by it: 175 salad dressing choices, 80,000 possible Starbucks beverages, 50 different mutual funds for your retirement account. “All of this choice,” psychologists say, “starts to be not only unproductive, but counterproductive—a source of pain, regret, worry about missed opportunities, and unrealistically high expectations.” And yet, we have more choices than ever—32,000 hours to watch on Netflix, 10 million e-books on our Kindles, 5,000 different car makes and models, not counting color and dozens of options.
It’s too much. We need help. And that help is available in the form of recommendation engines. In fact, they may be helping us a bit too much, according to my guest today.
Michael Schrage is a research fellow at the MIT Sloan School's Initiative on the Digital Economy. He advises corporations— including Procter & Gamble, Google, Intel, and Siemens—on innovation and investment, and he’s the author of several books including 2014’s The Innovator’s Hypothesis, and the 2020 book Recommendation Engines, newly published by MIT Press.