Nvidia CEO Jensen Huang donates OSU supercomputer - Protocol

2022-10-15

Jensen Huang and his wife, Lori Huang, are donating $50 million to Oregon State University to help fund the development of a new innovation complex that will include an Nvidia supercomputer.

“[W]e need to put (AI) in the hands of scientists, so they can apply it to the most important and pressing challenges,” Huang told Protocol.

Developing software and chips to tackle AI applications has for years been at the core of Nvidia’s mission, and it’s something founder and CEO Jensen Huang talks about at just about every turn.

The way Huang frames it, AI is a kind of time machine that allows scientists and researchers to effectively simulate aspects of the future, such as climate change. With a supercomputer, “something that used to take a month, now takes a day,” Huang told Protocol. “That’s a time machine. And you can see the future like you can’t possibly imagine.”

Jensen and his wife, Lori Huang, announced a $50 million donation to Oregon State University on Friday evening that will help fund a new $200 million Innovation Complex. The new complex, which will be named after Huang and his wife, will include a supercomputer built around Nvidia’s AI clusters that will be capable of training the largest AI models and performing digital twin simulations that will help researchers in climate science, materials science, and robotics, among other fields.

Protocol had a chance to catch up with Jensen and Lori Huang this week over a video conference to discuss the reasons for the donation, whether Moore’s law is truly dead, and why AI is such a crucial tech for universities to invest in.

Jensen Huang’s comments have been edited for clarity and length.

Artificial intelligence is one of the most transformative technologies that the world’s ever known. We can apply intelligence to problems at an extraordinary scale. Humans have great intelligence, but we can only read so much information and wrap that intelligence around so much data. And artificial intelligence, especially with today’s computing scale, could solve problems that no humans could possibly imagine wrapping their arms around. This instrument [AI] is available for the world’s largest technology companies that apply it for all kinds of interesting, very important problems like shopping and music recommendation and things like that.

But we need to put this technology in the hands of scientists, so they can apply it to the most important and pressing challenges. Most universities don’t have the budget. And it’s really quite a shame that most universities today still haven’t come to grips with the idea that in order to advance the most important fields of science, you need a new type of instrument — just like we needed radio telescopes, just like we needed particle accelerators. We need instruments to advance science.


And in this new world of scientific discovery, first-principles, theoretical methods are still very important, but data-driven methods are vitally important. This data-driven method is really about inferring physics from sensor information, and in order to do this you need a large instrument. That large instrument [today] is a computer, and most universities just don’t have the budgets for it. They have the budget for the buildings, but they don’t have budgets for computers.

The semiconductor industry is near the limit. It’s near the limit in the sense that we can keep shrinking transistors, but we can’t shrink atoms — until we discover the same particle that Ant-Man discovered. Our transistors are going to hit limits, and we’re at atomic scales. And so [this problem] is a place where materials science is really going to come in handy.

A great deal of the semiconductor industry is going to be governed by advances in materials science, and materials science today is such an enormously complicated problem because things are so small. Without a technology like artificial intelligence, we’re simply not going to be able to simulate the complicated combination of physics and chemistry happening inside these devices. And so artificial intelligence has been proven to be very effective in advancing battery design. It’s going to be very effective in discovery and has already contributed to advancing more durable and lightweight materials. And there’s no question in my mind it is going to make a contribution to advancing semiconductor physics.

When something dies? It might be reincarnated, but it dies. The question is, what’s the definition of Moore’s law? And just to be serious, I think that the definition of Moore’s law is that advanced computers could allow us to do 10 times more computing every five years (strictly, it’s two times every one and a half years, but it’s easier to think of it as 10 times every five years) at a lower cost, so that you could do 10 times more processing at the same cost.

The Huangs’ $50 million donation will help fund an AI supercomputer, seen rendered here, at Oregon State University. Photo: Oregon State University

Nobody actually denies it at the physics level. Dennard scaling ended close to 10 years ago. And you could see the curves flattened. Everybody’s seen the curves flatten, I’m not the only person. So the ability for us to continue to scale 10 times every five years is behind us. Now, of course, for the first five years after, it’s the difference between two times and 10 times — you could argue about it a little bit and we’re running about two times every five years. You could argue a little bit about it, you can nip and tuck it, you could give people a discount, you could work a little harder, so on and so forth. But over 10 years now, the disparity between Moore’s law is 100 times versus four times, and in 15 years, it’s 1,000 times versus eight.
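As a quick sanity check on the arithmetic in that quote (my own sketch, not anything from Nvidia), compounding the two growth rates reproduces the gaps Huang cites: 100x versus 4x over 10 years, and 1,000x versus 8x over 15.

```python
# Compare Moore's-law-style scaling (10x every five years) with the
# roughly 2x every five years Huang says transistor scaling now delivers.

def growth(factor_per_period: float, years: float, period: float = 5.0) -> float:
    """Cumulative speedup after `years`, compounding `factor_per_period`
    once per `period` years."""
    return factor_per_period ** (years / period)

for years in (10, 15):
    moores = growth(10, years)  # historical trend: ~10x per five years
    actual = growth(2, years)   # post-Dennard reality: ~2x per five years
    print(f"{years} years: Moore's law {moores:.0f}x vs actual {actual:.0f}x")
# → 10 years: Moore's law 100x vs actual 4x
# → 15 years: Moore's law 1000x vs actual 8x
```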

We could keep our head in the sand, but we have to acknowledge the fact that we have to do something different. That’s what it’s really about. If we don’t do something different and we don’t apply a different way of computing, then what’s going to happen is the world’s data centers are going to continue to consume more and more of the world’s total power. It’s already noticeable, isn’t that right? It means the moment it gets into a few percent, then every year after that, it will [continue]. Every five years it will increase by a factor of 10.

So this is an imperative. It’s an imperative that we change the way we compute, there’s no question about it. And it’s not denied by any computer scientists. We just have to not ignore it. We can’t deny it. And we just have to deal with it. The world’s method of computation cannot be the way it used to be. And it is widely recognized that the right approach is to go domain by domain of application and get accelerated with new computer science.

The basic science thing and the reason for [our donation] is because people see a different future. And number one: Some people think that climate science and climate change is a real problem. Some people don’t. People also see the solutions differently. We need a time machine [a supercomputer] — we need a simulation. We need a method to predict the impact of climate science and the magnitude of impact in different regions around the world.


We can do this. It’s within the capabilities of our technology, within the capabilities of our time, to simulate and predict the impact of climate change in different regions around the world, so that we can answer the question, what does climate change mean to me? What does climate change mean to an Oregonian? What does climate change mean to an Australian? What does climate change mean to a Venetian? What does climate change mean to somebody living in Southeast Asia next to the Mekong River? What does climate change mean to somebody who lives in Northern California, where we have so many wildfires? We need to be able to answer the question, what does it mean?

Number two: We also need a simulator that allows us to simulate scenarios so that we can predict the impact or the mitigation strategies that we have and which ones we use first. Some mitigation strategies have great potential but have side effects. Every single mitigation strategy has side effects. We need to be able to simulate its impact as well as its side effects to understand the net benefit to mitigating climate change. And so there are all these types of questions that we would love to be able to answer. But we need amazing climate scientists and to give them the right instruments, the right tools — a time machine so that they can go into the future and explain and bring back the answer to us.

Max A. Cherney is a senior reporter at Protocol covering the semiconductor industry. He has worked for Barron's magazine as a Technology Reporter, and its sister site MarketWatch. He is based in San Francisco.

Companies and countries alike are promising an EV revolution over the next decade.

Automakers and governments have set varying targets to phase out the sale of gas-powered vehicles.

Nat Rubio-Licht is a Los Angeles-based news writer at Protocol. They graduated from Syracuse University with a degree in newspaper and online journalism in May 2020. Prior to joining the team, they worked at the Los Angeles Business Journal as a technology and aerospace reporter.

Sarah (Sarahroach_) writes for Source Code at Protocol. She's based in Boston and can be reached at sroach@protocol.com

The electric vehicle transition is already underway, but it will need to accelerate to keep the Paris Agreement’s goal of limiting global warming to 2 degrees Celsius in reach.

To speed things up, automakers and governments have set targets to phase out the sale of gas-powered vehicles. Those targets, though, vary country to country and automaker to automaker.

And it will require more than just making EVs to meet those targets. Installing robust and accessible charging infrastructure and reducing the environmental impact of critical mineral extraction for EV batteries are just some of the attendant issues to making aspirational goals a reality.

It’s a lot to navigate. To make it easier, we broke down the different goals as well as the challenges that stand in the way of more widespread EV adoption.

BMW wants 50% of global sales to be electric before 2030. The German automaker set an interim target of selling 2 million EVs by 2025.

Ford is planning for half of all vehicles it sells to be electric by 2030. Like BMW, the company has also set interim goals. By the end of 2023, Ford wants to produce 600,000 EVs a year. By 2024, it hopes to manufacture 270,000 Mach-Es a year for North America, Europe, and China; produce 150,000 Lightnings in North America and 150,000 electric Transit vans for North America and Europe; and sell 30,000 units of a yet-to-be-made electric SUV in Europe.

General Motors committed to selling only zero-emission cars and trucks by 2035. Easy peasy.

Honda aims to make its entire lineup zero-emissions in major markets by 2040. The company wants to offer 30 EV models by 2030, and crank out more than 2 million EVs a year. Honda is working on three new EV platforms for its models, one of which is in partnership with GM.

Hyundai plans to sell 1.9 million battery EVs annually by 2030, and will introduce 17 new EV models by then. That would equal 7% of the global market. Meanwhile, Kia — which is owned by Hyundai — wants to boost annual sales of battery EVs to 1.2 million by the end of the decade.

Mazda pledged that 25% of its vehicles will be electrified in 2030, while the rest of its offerings will be hybrids. The company just rolled out its first EV and it has plans to launch three new EVs by 2025.

Nissan wants EVs to make up at least 75% of its sales in Europe, 55% in Japan, and 40% in China by fiscal year 2026. By fiscal year 2030, it wants 40% of its U.S. sales to be EVs. It’s planning to introduce 23 new electrified models, including 15 new EVs, by the end of the decade.

Stellantis — the parent company of Dodge, Jeep, Chrysler, and other brands — is planning to only sell EVs in Europe by 2030, while half of all sales in the U.S. will be EVs by then. The company plans to offer more than 75 EV models and sell 5 million EVs annually around the world by 2030.

Subaru wants EVs to make up 40% of its sales by 2030, though it does have some catching up to do. The company launched its first-ever EV in November 2021.

Toyota expects its sales of all-electric vehicles to reach 3.5 million by 2030, and will introduce 30 EV models by that time. The company has sold millions of partially electrified vehicles, including the Prius, but it only introduced its first widely available all-electric car this year.

Volkswagen has committed to designing its last combustion engine platform in 2026, though it will still sell gas-powered cars after that. Still, it plans for half of all vehicles sold in the U.S. and China and 70% of all vehicles sold in Europe to be electric by 2030. The company aims for nearly all vehicles sold in all markets to be zero emissions by 2040.

Volvo is aiming for 50% of all car sales to be electric by 2025, with a long-term goal to be a fully electric car company by 2030. To get there, Volvo plans to put an electric motor — hybrid or otherwise — in every new car it launches from 2019 onwards.

Many countries have set electric and zero-emissions vehicle sales targets, including some of the biggest auto markets. They’ve also laid out charging infrastructure plans in an effort to make EVs more accessible to the masses.

The U.S. wants half of all new vehicles sold to be zero emissions by 2030 — and a network of 500,000 chargers to make that possible. Some states, such as California and New York, have set even more aggressive goals.

The U.K. intends to end the sale of new gas-powered vehicles by 2030 and hopes for all new cars and vans to be fully zero emissions by 2035. In March, the country pledged 1.6 billion pounds to help build a nationwide network of 300,000 charging stations.

The European Union voted to ban the sale of new internal combustion engine vehicles by 2035 earlier this year.

China has a goal for EVs to make up 40% of cars sold by 2030. The country ultimately wants to achieve carbon neutrality before 2060, meaning there will be no net release of carbon dioxide in the atmosphere. The Chinese government has been encouraging the adoption of EVs since 2009, when it offered subsidies for EV purchases. Those subsidies phased out in 2020.

Canada has set a mandatory target to end the sale of new gas-powered vehicles by 2035. The country is also spending $680 million through 2027 to build out its charging network.

Japan aims for all new cars to be electric by 2035. But after pushback from the CEO of Toyota, the country emphasized its support for hybrid vehicles as well. In June, it also pushed to remove a 50% zero-emission vehicle target from a G7 statement.

EVs are hands-down better than gas-powered cars for the climate. But the coming boom for critical minerals could cause environmental harm. There are also risks of unfair labor practices and an uneven transition that leaves low-income communities and emerging economies behind, owing to EVs’ higher upfront prices. But all isn’t lost.

Mining with a light footprint can help build more sustainable EVs. Most current methods for digging up critical minerals require copious amounts of water. There are also concerns that mining could wipe out endangered species, an issue that’s central to the fight over a proposed Nevada lithium mine. Some countries even want to mine the seafloor, which may be more problematic because of how much we don’t know about the impacts. There are a few solutions at hand, though they’re still nascent, including extracting lithium from brine associated with geothermal energy.

Recycling could play a role. Batteries are increasingly valuable commodities, and companies are coming up with ways to recycle them, thus reducing waste (and strain on the supply chain). Several companies including Redwood Materials, Li-Cycle, and Ascend Elements are already beginning to recycle batteries, and Ford and Volvo have partnered with Redwood on its recycling program.

Ensuring fair labor practices is also an important part of the transition to EVs. Unionization — already a staple at traditional automakers — could help ensure a fair transition. Setting up stronger policies around importing critical minerals dug up without forced labor could also improve mining conditions abroad.

EVs have to be accessible to everyone. The Inflation Reduction Act includes tax credits for used EVs for low- and middle-income people. That’s a start, as is the Biden administration’s commitment to ensuring that 40% of federal climate funds benefit disadvantaged communities. Despite that, some communities of color are already running into issues with how states are using federal charging funds appropriated under the bipartisan infrastructure law. It’s clear there needs to be more work done to lower the barriers to entry.


Today, companies across the world are facing unprecedented uncertainty. Consequences of the global pandemic, ongoing trade concerns and political conflicts have disrupted business operations, which has, in turn, exacerbated existing workforce issues, created supply shortages, and made demand forecasting and customer engagements more complex. How are businesses expected to thrive in this world order? According to a new report, the answer lies in the power of automation to stabilize workforces, drive economic growth, and build business resilience. Introducing the Automation Economy.

The Automation Economy, the focus this week at Imagine and the subject of the third edition of Automation Anywhere’s Automation Now & Next report, will accelerate how businesses scale automation and sustain performance. Of the 1,000 global organizations surveyed in the report, more than a third indicated automation will lead them out of global crises.

“Today’s business leaders must look beyond their current business processes and imagine how automation can enable them, and others, to make bolder moves and reimagine work,” says Mihir Shukla, CEO and co-founder of Automation Anywhere. “The reality is we just don’t have enough knowledge workers to do the work, and there’s much more work to be done. It doesn’t matter what you produce, but more importantly, how you are going to get the work completed and deliver the product to your customers?”


For certain sectors, intelligent automation is a must-have, not just a nice-to-have. In financial services, automated processes can include loan payment management, car loan applications, bank account management and much more. In a case study published by Automation Anywhere, one data firm needed data to be converted from one system to another. The projected time for a vendor to finish this process was two years, but the migration was completed in just 12 weeks with automation and bots running 24/7.

In healthcare, automation can improve patient outcomes by supporting medical advancements and by managing patient intake, scheduling, claims, and billing, freeing staff to ensure patients get the care they need. In retail, automation services can make ERP and supply chain processes more efficient, including creating and disseminating reports, clearing invoices, and checking payment status against service-level agreements (SLAs).

The C-suite views automation as a vital tool in the business toolbox that can revitalize their workforce and improve employee retention. After all, if workers don’t have to focus on routine manual tasks, they can be more engaged with other aspects of their job. In the Automation Anywhere report, around 40% of survey respondents believed that more than half of all employees could benefit from even just a single bot to help them in their daily work routine.

Also, a whopping 94% of respondents said moving employees to higher-value work is a top priority for the coming year.

According to Shukla, he has been on a mission for nearly 20 years to unleash human potential by helping every company in every sector across the globe build a digital workforce and succeed with automation.

Teaming up with a digital coworker is par for the course for businesses seeking to address key challenges, but it is also a useful strategy for engaging employees with a new kind of colleague. Automation Anywhere itself uses hundreds of digital coworkers internally across multiple departments. “Our employees aren't just more productive with bots — they are happier,” says Shukla. “Employees and customers have quickly come to not only rely on their digital workers but to engage with them, giving them friendly nicknames and wanting to communicate with them in a more personal way.”

Shukla goes on to say that Automation Anywhere is delivering on that promise for customers. “When we empower human workers to offload manual tasks to automation, we unleash their potential to pivot to the next big idea, build deeper customer relationships and drive business growth.”

That is a future many business leaders are embracing to attain a competitive advantage. A quarter of respondents in the Automation Anywhere report said they are increasing automation funding by at least 25% to help speed up automation deployments. Sitting on their hands simply isn’t an option any longer, especially as more companies focus diligently on building a resilient workforce buttressed by both human and digital workers.

Digital transformation continues to accelerate at a rapid pace across enterprise businesses, and it can be overwhelming to adapt to an ever-evolving culture of technological change. But for leaders looking to drive growth, embracing the Automation Economy points to positive outcomes ahead. Business leaders can keep running current operations on the status quo model, or they can choose the bolder, more rewarding path of making calculated bets and exploring new technologies and solutions to scale automation across the company.

By focusing on underserved industries, vertical SaaS companies like ServiceTitan, Toast, and Procore are taking market share from the cloud giants.

It’s becoming increasingly clear that you don’t have to sell to everyone to win.

Aisha Counts (@aishacounts) is a reporter at Protocol covering enterprise software. Formerly, she was a management consultant for EY. She's based in Los Angeles and can be reached at acounts@protocol.com.

There’s an age-old idea in the software business that if you want to be big, you have to build a company that can meet the needs of every enterprise. But a litany of enterprise software companies, from Veeva and Procore to Toast and ServiceTitan, are turning that thesis on its head by relentlessly focusing on niche industries — and winning.

One by one, vertical software companies have gone public at eye-popping numbers: Veeva at $2 billion back in 2013, Procore at $8.5 billion last year, then Toast at $20 billion a few months later.

The success of vertical software has pushed SaaS giants from Microsoft to Google to Salesforce to launch industry-focused clouds across manufacturing, health care, financial services, and others. It’s an acknowledgement by those companies that in order to compete and win against the next generation of startups, they’ll need to deepen their expertise.

The SaaS giants know that by going broad, there will be some customer needs they can't meet. But the enterprise software industry is also cyclical, and at some point the specialists may decide they need to go broad to scale.

But right now, the rise of vertical SaaS is yet another pendulum swing in the everlasting struggle between platform players and specialists.

The enterprise software industry, led by SAP, Oracle, and Microsoft, thrived for decades on the notion that the best way to win was by expanding your total addressable market as widely as you could. That meant capturing customers across industries and markets to grow at blistering pace, which worked when software was relatively new to the captains of industry.

Although the horizontal software play has been a narrative dominating enterprise tech, a growing list of vertical software companies such as ServiceTitan, Procore, and Toast are upending that narrative and shifting market sentiment.

“The market, or at least investor interest in vertical SaaS, has dramatically increased over the past few years,” said Talia Goldberg, a partner at Bessemer Venture Partners, which has been investing in vertical software for more than a decade.

One of those companies, ServiceTitan, found success by building a software platform that spans marketing, human resources, and finance for plumbers, electricians, and other trade businesses.

ServiceTitan co-founder Vahe Kuzoyan didn’t set out to start a company, but when he noticed his dad’s plumbing software was outdated and couldn’t find an alternative in the market, he decided to build the software himself. That software turned into ServiceTitan, which is now worth an estimated $9.5 billion.

But it wasn’t always easy to convince investors that there was a market for software that catered to trade industries, because field services businesses were often overlooked by investors and an industry-specific focus was viewed as limiting by venture capitalists obsessed with growth.

“The orthodoxy at that time was, you draw a box around a category of software and then you do really well in that category and then you go try to sell it to as many customers as you can,” said Kuzoyan.

As ServiceTitan grew in revenue and traction, its financial performance dimmed some of that skepticism, but its story isn’t unique.

“What [investors] missed is that you can capture significant market share in [a] vertical much more so than in any horizontal industry,” said Bessemer’s Goldberg. In the CRM space, for example, Salesforce is dominating the market with about 30% market share. But “in vertical software you can credibly get to 50% plus market share,” she said.

The reason vertical software companies are able to capture so much market share is they’re often building software for underserved or complex industries that they can understand more deeply than a bigger software company.

That’s especially true in industries such as restaurants, health care, construction, or financial services; customers want software that can help them keep up with ever-shifting regulatory environments, complicated sales processes, and unique business models.

But these are the exact types of industries that have traditionally been ignored by large software vendors.

“Despite being one of the largest industries in the world, restaurants have been underserved by technology,” said Chris Comparato, CEO of restaurant software vendor Toast. In response, restaurants have been forced to stitch together software designed for other businesses or default to manual processes to serve their needs.

“Restaurants have been plagued with a constellation of bolted-together point solutions, manual workflows and workarounds, and horizontal software providers with generic solutions who didn’t appreciate the complexity of their business,” said Comparato.


Vertical tools, on the other hand, solve many of these challenges by providing exactly what specific customers need and serving as the central nervous system for the businesses they serve.

But beyond the tools themselves, another advantage for vertical SaaS companies is the expertise of their sales and customer service teams.

At Toast, for example, nearly two-thirds of employees have restaurant industry experience, while ServiceTitan and Procore also employ a significant number of people with backgrounds in the trades or construction respectively.

That enables employees to address customer questions and problems in a way that employees at the cloud giants probably couldn’t.

“What’s great about a vertical solution is it speaks directly to the customer in the language they understand,” said Wyatt Jenkins, senior vice president of product at construction software company Procore. In other words, “we authentically speak the language of construction,” he said.

As more vertical software companies have found success, established vendors such as Salesforce, Microsoft, and Google have also entered the fray, launching industry-focused versions of their cloud products.

“I think it's very telling,” said ServiceTitan’s Kuzoyan. “To me it's actually reinforced the broader thesis that the future is vertical. Otherwise they would be creating best-in-class categories that you could then mix and match. That’s not where most of the conversation is.”

But are companies like ServiceTitan, Procore, or Toast worried about the SaaS giants moving into their space? Not really, said Kuzoyan, who noted that he loves going up against Salesforce.

That’s because vertical software companies know large SaaS giants don’t have the expertise or knowledge to meaningfully compete in their industries. And the truth is, it just doesn't make financial sense for horizontal players to home in on any one industry.

“They’re trying to appeal and wrap a new cover on top of their offering and make a few small tweaks to serve those customers, but in reality, they're not going and building all the tiny little features and labels and completely changing the whole workflows because it's not worth it,” said Goldberg.

That doesn’t mean it’s all smooth sailing for vertical software companies.

ServiceTitan co-founder Vahe Kuzoyan Photo: ServiceTitan

While ServiceTitan’s Kuzoyan estimates the market for trade business software at nearly $1 trillion, and Procore’s Jenkins said the construction industry accounts for 13% of global GDP, that potential isn’t true of all industries.

“One of the other challenges with vertical software is you’ve got to find these deep verticals that have that potential, because otherwise you're going to close off the size of markets you can go to,” said Jenkins.

And because vertical software companies are playing across fewer industries, they often have to be the leading software provider in their space in order to be a breakout success. “Whereas the No. 2 in the CRM market or the HR software market is a pretty exciting place to be … in the vertical SaaS ecosystem, the size of the prize for the No. 2 is substantially smaller,” said Goldberg.

Although it’s still early in the vertical software market, the potential is enormous. “If you look at the universe of vertical SaaS businesses, it's a relatively nascent field; it's not very mature,” said Kuzoyan.

While companies like Toast, Procore, and Veeva have already gone public, others such as ServiceTitan haven’t yet. While ServiceTitan wouldn’t disclose any plans to go public, Insider reported that the company confidentially filed for an IPO earlier this year.

Over the coming years, we can expect more vertical software companies to emerge, go public, or even be acquired by some of the horizontal SaaS giants.

Despite the recent downturn, there are still opportunities for vertical SaaS players: public markets, which have historically been favorable to vertical software, as well as interested private equity buyers and M&A, said Goldberg. “I think the opportunities and the breadth of opportunities for vertical software companies is just as large as it is for any horizontal SaaS company today. There’s no real difference.”

As industry-focused companies continue to prove they can compete with the SaaS giants, and as the SaaS giants themselves move into industry territory, it’s becoming increasingly clear that you don’t have to sell to everyone to win.

The big platform players can do a lot of things, but as the saying goes, if you’re a jack of all trades, you’re a master of none.

Correction: This story was updated on Oct. 14, 2022, to reflect ServiceTitan's most recent valuation.

Aisha Counts (@aishacounts) is a reporter at Protocol covering enterprise software. Formerly, she was a management consultant for EY. She's based in Los Angeles and can be reached at acounts@protocol.com.

The Small Business Administration will consider lifting a decades-old moratorium on who can lend its government-backed loans.

The change could open the door for fintechs to write loans backed by the SBA 7(a) program.

The Biden administration's efforts to help small-business owners get better access to capital could open up a big opportunity for fintech lenders.

The Small Business Administration will soon propose a rule change that could lift a 40-year moratorium on new licenses for nonbanks — including fintechs — to lend through its largest loan program. The plan was revealed last week in a list of policy initiatives from Vice President Kamala Harris aimed at advancing racial equity in small-business ownership.

The change could open the door for fintechs to write loans backed by the SBA 7(a) program, a roughly $35 billion annual program that offers loans up to $5 million to small businesses, backed up to 85% by the federal government.

The program is mostly limited to depository institutions or banks. Some non-depository lenders can write the loans through a special license overseen by the SBA. But the number of “small business lending company” licenses has been capped at 14 since 1982, meaning lenders that wish to participate must either bid for one of those licenses or partner with a bank on the loans.

The 7(a) loans are designed to serve business owners that struggle to get other types of financing, but data shows long-standing disparities in the loans based on race and income. Harris' announcement on Oct. 4 said the administration hopes that having more lenders will make the loans more accessible, "particularly in smaller-dollar and underserved markets, where borrowers are most acutely shut out of” lending.

“For too long, the small business ecosystem in underserved communities has struggled to keep up with better funded businesses and entrepreneurs in more prosperous communities,” Harris said.

Fintechs believe they can help. A study of lending data from Funding Circle and LendingClub published last month by the Bank for International Settlements found fintech lenders had the potential to allow “small businesses that were less likely to receive credit through traditional lenders to access credit and to do so at lower cost.”

"The fintech industry is often serving minority-owned, low- to moderate-income, and the smallest of small businesses," said Ryan Metcalf, head of public policy and social impact at Funding Circle. "That's the population the SBA is struggling to reach through banks."

When the Paycheck Protection Program was created in response to the economic hardship brought on by the pandemic, the SBA cleared fintech lenders to originate loans for the program. An analysis by the Federal Reserve Bank of New York found that fintech lenders "likely served borrowers who would not have received loans otherwise," often because they lacked existing banking relationships. About 1 in 4 Black-owned firms applied to fintech lenders, more than twice the rate of white-, Asian-, and Hispanic-owned firms, according to the New York Fed.

"If we're serious about expanding access to capital for those business owners and entrepreneurs who have historically lacked such access — and that is part of the original purpose for SBA funding support programs — then we should widen the scope of who's able to participate," said Dane Stangler, director of strategic initiatives at the think tank Bipartisan Policy Center.

The Bipartisan Policy Center has convened a panel of bankers, fintech leaders, and small-business owners to study how the SBA can best serve small-business owners. While that panel has not yet put out any formal recommendations, the BPC supported a bill last year from Sens. Tim Scott and John Hickenlooper to lift the moratorium on new SBA lending licenses.

But while fintech companies were credited with helping more businesses access PPP loans, researchers found that some of those fintechs were responsible for a significant share of fraudulent loans — which could weigh on the decision to allow further expansion of SBA-backed loans to nonbank lenders.

"There are good actors and bad actors in the fintech ecosystem," Stangler said. "But we definitely think that this particular step is something that should be considered. Very carefully crafted, because it has some significant regulatory implications, but genuinely considered if our goal is to expand access to capital."

One major question will be how the SBA structures the change. No rule has been proposed yet, and an SBA spokesperson declined to comment on when to expect one. Bill Briggs, a former SBA official, told Inc. that the process could take up to a year from when the rule is first proposed.

If the rule change is approved, fintech companies will still need to weigh the compliance costs and resources required to pursue a lending license. Metcalf said there is strong demand from both small businesses and investors to fund the loans, particularly for fintechs that can reach more business owners and streamline the application process.

“There are additional levers to be pulled to reach more populations,” Metcalf said. “Increasing the distribution channels is a step in that direction.”

MIT Energy Initiative’s Howard Herzog explains why the $100-per-ton number is unrealistic.

Lowering the cost per ton for carbon dioxide removal is critical to ensuring the industry is economically viable.

Michelle Ma (@himichellema) is a reporter at Protocol covering climate. Previously, she was a news editor of live journalism and special coverage for The Wall Street Journal. Prior to that, she worked as a staff writer at Wirecutter. She can be reached at mma@protocol.com.

$100 per ton is the carbon dioxide removal industry’s standard-bearing metric. It’s the target identified by both Frontier’s well-respected advance purchase commitment and the Department of Energy’s Carbon Negative Shot for ensuring CDR is scalable.

Experts agree that we need to remove billions — possibly many billions — of tons of carbon dioxide from the atmosphere to have a decent shot at achieving net zero by midcentury. CDR at that scale would be enormously costly, so lowering the cost per ton is critical to ensuring the industry is economically viable.

The $100-per-ton target is what economists, policymakers, investors, and the industry itself generally agree makes CDR feasible at scale. According to a survey of CDR stakeholders from CarbonPlan, there isn’t consensus on what that number means. Some view it as a break-even point for sellers, and others refer to it as a post-incentive price for buyers. But for the industry at large, it’s an accepted — and achievable — target.

Most techniques that reliably pull carbon from thin air currently cost much more than that. “$100 per ton is an extremely ambitious 10-year target, probably more of a 15- to 20-year target,” Carbon180 senior visiting scholar Shuchi Talati told Protocol, adding that it’s “important to be ambitious” and “there’s a lot of momentum around CDR and getting these technologies to scale.”

Yet ambition and momentum may not be enough to reach that milestone, according to Howard Herzog, a senior research engineer with MIT’s Energy Initiative. He’s been studying carbon capture for over 30 years (even writing a book about it in 2018) and is more skeptical given the capital costs to build CDR plants and the enormous amount of energy they need to run. He sat down with Protocol to talk about why he thinks $100 per ton is “pure fantasy.”

This conversation has been edited for brevity and clarity.

Why do you think $100 per ton is an unrealistic target?

Carbon dioxide is so diluted in the air that in order to capture it, almost irrespective of what process you use, you’re going to have to push a lot of air through these machines, and that means a lot of capital costs and a lot of energy spent.

Estimates put the energy requirement at 1,200 kilowatt-hours per ton of carbon dioxide. The cost of electricity here where I live in Massachusetts is 20 cents a kilowatt-hour. Europe is pushing up prices to 40 cents. And this energy has to be carbon-free. Very few places have carbon-free electricity, but let’s say you can do it for 10 cents a kilowatt-hour, which I think is really stretching it — that’s $120 per ton of carbon dioxide.
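Herzog’s back-of-the-envelope arithmetic can be checked in a few lines. This sketch uses only the figures he cites — 1,200 kilowatt-hours per ton and his electricity prices — and covers energy cost alone, not capital cost:

```python
# Energy-only cost of direct air capture, using Herzog's cited figures.
ENERGY_KWH_PER_TON = 1_200  # estimated electricity to capture 1 ton of CO2

def energy_cost_per_ton(price_per_kwh: float) -> float:
    """Electricity cost per ton captured, ignoring capital costs entirely."""
    return ENERGY_KWH_PER_TON * price_per_kwh

print(energy_cost_per_ton(0.10))  # optimistic carbon-free power: $120/ton
print(energy_cost_per_ton(0.20))  # Massachusetts retail rate: $240/ton
print(energy_cost_per_ton(0.40))  # European prices: $480/ton
```

Even at his "really stretching it" 10-cent rate, electricity alone already exceeds the $100-per-ton target.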

That’s before you even start including the capital cost, which is significant. You need larger machines to process all that air. You want to put the air through these machines at a certain rate. And because of that, it’s going to be a large capital cost. Just looking at that, $100 or even $200 per ton just doesn’t pass the smell test.

So what do you think is a more realistic minimum cost for carbon dioxide removal?

Basic physics and engineering say there are some minimum requirements, and when you look at the most optimistic situation, my estimate for where we might be is $600 to $1,000 per ton in 2030.

Isn’t it possible to get capital costs down with scale?

There’s some truth to that, but it’s one thing to get it down by 40% or even cut it in half, but getting it down by an order of magnitude is a whole other dimension.


There are also capital costs that go up as they scale. There was trouble with the Climeworks installation in Iceland last winter because of the temperature. [Editor’s note: Herzog is referring to the company’s plant, which dealt with frozen machinery last winter. Climeworks head of climate policy Christoph Beuttler said it was a “very good example of how important it is to deploy now and to get the experience.”] We saw this in Texas when everything broke down in cold temperatures; they didn’t spend money to weatherize it, so that adds cost. You’re putting things out, you want them to run at least 20 years. To do that, you have to harden them to stand up to the elements. It’s one thing to make a small demonstration, but when these things mature, some things will raise costs, other things will lower costs.

On the energy front, isn’t it possible to get costs down with the expansion of renewables?

Even if it’s several cents per kilowatt-hour, [renewables are] still intermittent, and you need this to run 24/7, which has a whole bunch of other costs. Say I just buy a lot of batteries so I can have this running all the time: That's going to cost more than the original wind farm in the first place. On the grid, there’s still the backup problem and the peak problems. When you start putting more and more renewables on the grid, these system costs become more important. So if renewables are only 5% of my energy, there’s not a lot of integration costs; they’ve been absorbed pretty well. When you start getting up to 30[%] to 50% renewables, these costs start becoming much more significant.

Why do these machines have to operate 24/7?

Running these machines costs a lot of money. If I’m running 24/7 and capturing 1,000 tons of carbon a year, OK. If I’m only running half the time, capturing 500 tons a year, the dollar per ton just doubled. The capital cost is still the same. That’s the problem with all of these capital-intensive processes: You need them to operate a significant amount of the time. Usually you shoot for 85[%] to 90% of the time.
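The utilization effect Herzog describes is simple division: capital cost is fixed, so halving the hours the plant runs doubles the capital cost per ton. A minimal sketch (the dollar figure for annual capital cost is illustrative, not from the interview):

```python
def capital_cost_per_ton(annual_capital_cost: float,
                         tons_per_year_at_full_use: float,
                         utilization: float) -> float:
    """Fixed capital cost spread over the tons actually captured.

    utilization: fraction of the year the plant runs (1.0 = 24/7).
    """
    tons_captured = tons_per_year_at_full_use * utilization
    return annual_capital_cost / tons_captured

# Hypothetical plant: $200,000/year in capital costs, 1,000 tons/year at 24/7.
print(capital_cost_per_ton(200_000, 1_000, 1.0))   # $200/ton running 24/7
print(capital_cost_per_ton(200_000, 1_000, 0.5))   # $400/ton at half time -- doubled
print(capital_cost_per_ton(200_000, 1_000, 0.85))  # ~$235/ton at the typical 85% target
```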

$100 per ton is just a target to aspire to. What’s wrong with that?

I just like to deal with facts. I think it’s disingenuous. If you’re really interested in solving climate change, you’ve got to level with people.

Estimates from the U.N. and other sources say that if we want to get to where we need to be, we may need to remove 10 billion tons of carbon dioxide a year by midcentury. Do you agree with that?

It’s at minimum a few billion tons a year, because there are certain sectors that are really hard to decarbonize. A part of that depends on how expensive you think these different sectors are to do, and then how expensive you think the offsets are going to be.


Say I want to decarbonize airplane biofuels and that costs me $700 per ton. If I can capture carbon from the air for $500 per ton, why not just keep emitting the carbon dioxide out of the airplane and capture the air to offset it? And it’s cheaper by $200 per ton. So that’s the driving force. And so I would say that even direct air capture at $500 a ton will have a benefit. For offsets like DAC, they’re going to be more effective for anything that costs more than their price. And so you have to look at the whole system.
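The offset logic Herzog lays out reduces to a per-ton comparison: for each hard-to-abate source, pay for whichever is cheaper, direct decarbonization or removal. A sketch using his airplane-fuel numbers (the function name and structure are illustrative):

```python
def cheaper_option(decarbonization_cost: float, dac_cost: float) -> str:
    """Per ton of CO2: decarbonize the sector, or keep emitting and offset via DAC."""
    if dac_cost < decarbonization_cost:
        saving = decarbonization_cost - dac_cost
        return f"offset with DAC, saving ${saving:.0f}/ton"
    return "decarbonize directly"

# Herzog's example: airplane biofuel at $700/ton vs. direct air capture at $500/ton.
print(cheaper_option(700, 500))  # offset with DAC, saving $200/ton
```

This is why removal at $500 per ton can still be economically useful: it caps the cost of decarbonizing any sector whose direct abatement is more expensive.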

It gets more complicated, because there are quite a few negative emissions technologies. DAC isn’t the only one. And none of them are unlimited in their application. If you have a technology that can give you unlimited carbon removal at $100 a ton, that’s nirvana. We’re done, we’ve solved that problem.

If $600 to $1,000 per ton is the likely cost of CDR, what role do you think it will play to get to net zero?

The question is, will there be other, cheaper offsets than that? Every offset has problems. Offsets from bioenergy with carbon capture and storage are cheaper and much more doable. The big issue is the biomass feedstock: how much there is and what the cost will be. Another option, one that I really like, is called liming the ocean. But politically, it’s a nightmare. Think about throwing a chemical in the middle of the ocean. Just think of the protest. But even today, by putting carbon dioxide in the atmosphere, most of that ends up in the ocean.

It’s very frustrating. When people think things are too easy, they won’t address the hard decisions, even though those hard decisions may end up with a better solution. At this point, I don’t know if liming the ocean is a great idea or not, but it has a lot of potential, and we have to look at things like that if we want to get to net zero. And people say capturing carbon dioxide from the air for $100 per ton will get us to net zero. But if it’s a fantasy, it’s not going to get us there.

