Lightmatter’s photonic AI ambitions light up an $80M B round – TechCrunch

AI is fundamental to many products and services these days, but its hunger for data and computing cycles is bottomless. Lightmatter plans to leapfrog Moore’s law with its ultra-fast photonic chips specialized for AI work, and with a new $80 million round, the company is ready to bring its light-powered computers to market.

We first covered Lightmatter in 2018, when its founders were fresh out of MIT and had raised $11 million to prove that their photonic computing idea was as valuable as they claimed. They spent the next three years building out and evolving the technology – and encountering all the hurdles that hardware startups and technical founders tend to find.

For a fuller look at how the company’s technology works, read that original feature – the gist hasn’t changed.

In short, Lightmatter’s chips perform certain complex calculations fundamental to machine learning extremely quickly. Instead of using charge, logic gates, and transistors to record and manipulate data, the chips use photonic circuits that perform the calculations by manipulating the path of light. This has been possible for years, but only recently has it begun to work in a way that is practical and, for certain purposes, very valuable indeed.

Prototype to product

When Lightmatter launched in 2018, it wasn’t entirely clear whether this technology could be built into something that could be sold to replace more traditional compute clusters, like the thousands of custom units that companies like Google and Amazon use to train their AIs.

“We basically knew the technology was supposed to be great, but we had to figure out a lot of details,” said CEO and co-founder Nick Harris in an interview with TechCrunch. “We had to overcome many difficult theoretical challenges in the areas of computer science and chip design … and COVID was a beast.”

With suppliers disrupted and many in the industry pausing partnerships, delaying projects, and so on, the pandemic put Lightmatter months behind schedule, but the company came out the other side stronger. Harris said the challenges of building a chip business from the ground up were significant, if not unexpected.

A rack of Lightmatter servers.

Credit: Lightmatter

“In general, what we do is pretty insane,” he admitted. “We build computers from nothing. We design the chip, the chip package, the card the chip package sits on, the system the cards go into, and the software that runs on it all. We had to build a company that encompassed all of that expertise.”

The company has grown from a handful of founders to more than 70 employees in Mountain View and Boston, and that growth will continue as it brings its new product to market.

Where a few years ago Lightmatter’s product was more of an educated twinkle in the eye, it has since taken on solid form as Envise, which the company bills as a “general-purpose photonic AI accelerator.” It’s a server unit designed to fit into normal data center racks, equipped with multiple photonic computing units that can run neural-network inference at dazzling speeds. (It’s currently limited to certain types of calculations – namely linear algebra rather than more complex logic – but that kind of math happens to make up the bulk of machine learning workloads.)
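To make concrete why “linear algebra only” is still a big deal, here’s a minimal sketch – plain NumPy, nothing Lightmatter-specific, with made-up layer sizes – of a neural-network inference step: nearly all of the arithmetic is in the matrix product, which is exactly the workload a photonic accelerator would take over, while the activation and control logic stay on conventional electronics.

```python
import numpy as np

# Toy fully connected layer: y = relu(W @ x + b).
# The matrix-vector product W @ x dominates the arithmetic; this is the
# linear-algebra workload a photonic accelerator targets, while the
# activation and any control flow stay on ordinary digital logic.
rng = np.random.default_rng(0)
W = rng.standard_normal((1024, 1024))  # layer weights (illustrative size)
b = rng.standard_normal(1024)          # bias
x = rng.standard_normal(1024)          # input activations

y = np.maximum(W @ x + b, 0.0)         # ReLU applied after the matmul

# Rough operation count: one 1024x1024 matrix-vector product is about
# 2 * 1024^2 (roughly 2.1M) multiply-accumulates, and a large transformer
# like BERT chains hundreds of such products per token.
print(f"multiply-accumulates in this one layer: {2 * W.size:,}")
```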

Harris was reluctant to give precise figures on performance improvements, but more because those numbers are still improving than because they aren’t impressive enough. The company’s website suggests Envise is 5x faster than an NVIDIA A100 on a large transformer model like BERT, while using around 15 percent of the power. That makes the platform doubly attractive to AI giants like Google and Amazon, which constantly need more computing power and pay through the nose for the energy required to run it. Either better performance or lower energy costs would be welcome – both together are irresistible.
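Taking the website’s figures at face value, the two claims compound: five times the throughput at roughly 15 percent of the power implies something like a 33x advantage in work done per joule. A quick back-of-the-envelope check, with the baseline normalized to 1 (these are not published A100 benchmark numbers):

```python
# Back-of-the-envelope: combine the claimed 5x speedup with ~15% power draw.
# The baseline values are normalized placeholders, not real A100 figures.
baseline_throughput = 1.0  # normalized inferences per second
baseline_power = 1.0       # normalized watts

envise_throughput = 5.0 * baseline_throughput  # "5x faster" claim
envise_power = 0.15 * baseline_power           # "around 15 percent of the power" claim

efficiency_gain = (envise_throughput / envise_power) / (
    baseline_throughput / baseline_power
)
print(f"Implied performance-per-watt advantage: ~{efficiency_gain:.0f}x")  # ~33x
```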

Lightmatter’s plan is to get these units in front of its most likely customers for testing by the end of 2021, refine them, and bring them up to production levels so they can be deployed widely. Harris stressed, however, that this is essentially the Model T of their new approach.

“If we’re right, we’ve just invented the next transistor,” he said, and for large-scale computing purposes the claim is not unfounded. You won’t be holding a miniature photonic computer in your hand any time soon, but in data centers, which are expected to consume 10 percent of the world’s energy by 2030, “they really have unlimited appetites.”

The color of math

A Lightmatter chip with a logo on the side.

Credit: Lightmatter

There are two main ways Lightmatter intends to improve the capabilities of its photonic computers. The first, and the craziest-sounding, is processing in different colors.

It’s not as wild as it sounds when you consider how these computers actually work. Transistors, which have been at the heart of computing for decades, use electricity to perform logical operations, opening and closing gates and so on. At a macro scale you can have different frequencies of electricity that can be manipulated like waveforms, but at this smaller scale it doesn’t work that way. There’s only one form of currency: electrons, and gates are either open or closed.

In Lightmatter’s devices, however, light passes through waveguides that perform the calculations as it goes, which in some ways simplifies and speeds up the process. And, as we all learned in science class, light comes in different wavelengths – each of which can be used independently and simultaneously on the same hardware.

The same optical magic that processes a signal from a blue laser at the speed of light works for a red or green laser with minimal modification. And as long as the light waves don’t interfere with one another, they can travel through the same optical components simultaneously without losing coherence.

That is, if a Lightmatter chip can do, say, a million calculations per second with a red laser source, adding another color doubles that to two million, and adding a third makes it three million – with very little extra modification required. The main obstacle is getting lasers that are up to the task, Harris said. Being able to keep roughly the same hardware and get double, triple, or 20x the performance makes for a nice roadmap.
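The scaling argument is simple enough to write down. A minimal sketch of the idea, using the article’s hypothetical per-color rate rather than any real Lightmatter spec:

```python
# Wavelength-division parallelism: each laser color drives the same photonic
# hardware independently, so throughput scales linearly with the color count.
def total_throughput(ops_per_sec_per_color: float, num_colors: int) -> float:
    """Aggregate operations per second across all wavelengths."""
    return ops_per_sec_per_color * num_colors

per_color = 1_000_000  # the article's hypothetical: 1M calculations/sec on one color
for colors in (1, 2, 3, 20):
    print(f"{colors:>2} color(s): {total_throughput(per_color, colors):,.0f} ops/sec")
```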

That leads to the second challenge the company is working on right now: interconnect. Every supercomputer is made up of many small individual computers, thousands upon thousands of them, working in perfect synchrony. To do that, they have to communicate constantly, making sure each core knows what the other cores are doing, and otherwise coordinating the immensely complex computing problems supercomputing is meant to take on. (Intel has talked about this “parallelism” problem in building an exascale supercomputer.)

“One of the things we’ve learned along the way is, how do you get these chips to talk to each other when they’re so fast that they’re just sitting there waiting most of the time?” said Harris. Lightmatter’s chips work so quickly that they can’t rely on conventional computing cores to coordinate between them.

A photonic problem apparently requires a photonic solution: a wafer-scale interconnect board that uses waveguides instead of optical fibers to shuttle data between the various cores. Fiber-optic connections aren’t exactly slow, of course, but they aren’t infinitely fast either, and the fibers themselves are fairly bulky at the scale of chips, which limits the number of channels you can run between cores.

“We built the optics, the waveguides, into the chip itself. We can put 40 waveguides in the space of a single optical fiber,” said Harris. “That means you have many more lanes operating in parallel – that brings you to absurdly high interconnect speeds.” (Chip and server fiends can find more detailed specs here.)
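For a sense of why that density matters, here is a toy comparison of lane counts and aggregate bandwidth across the same physical space. Only the 40-to-1 waveguide-to-fiber density comes from Harris’s quote; the lane budget and per-lane data rate are assumptions made purely for illustration.

```python
# Toy interconnect comparison: parallel lanes that fit in the same space.
# Only the 40:1 waveguide-vs-fiber density figure comes from the article;
# the lane budget and per-lane rate below are made-up illustrative numbers.
fiber_lanes = 10                 # hypothetical fibers fitting along one chip edge
waveguides_per_fiber_width = 40  # "40 waveguides in the space of a single optical fiber"
per_lane_gbps = 25               # assumed identical per-lane rate, illustration only

fiber_bandwidth = fiber_lanes * per_lane_gbps
waveguide_lanes = fiber_lanes * waveguides_per_fiber_width
waveguide_bandwidth = waveguide_lanes * per_lane_gbps

print(f"Fiber:      {fiber_lanes:>4} lanes -> {fiber_bandwidth:>6} Gbps aggregate")
print(f"Waveguides: {waveguide_lanes:>4} lanes -> {waveguide_bandwidth:>6} Gbps aggregate")
```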

The optical interconnect board is called Passage, and it will be part of a future generation of Envise products – but, like the color computation, it’s for a future generation. For now, 5-10x the performance at a fraction of the power will have to keep potential customers satisfied.

Making the $80 million work

Those customers, initially the “hyperscale” data handlers that already operate data centers and supercomputers and are maxing them out, will get the first test units later this year. That’s where the B round money is going first, Harris said: “We’re funding our early access program.”

That means both building hardware to ship (very expensive per unit before economies of scale kick in, not to mention the current supplier difficulties) and building out the go-to-market team. Service, support, and the immense amount of software that goes along with something like this – there’s a lot that goes into it.

The round itself was led by Viking Global Investors, with participation from HP Enterprise, Lockheed Martin, SIP Global Partners, and previous investors GV, Matrix Partners, and Spark Capital. It brings the company’s total raised to approximately $113 million: there was the initial $11M A round, then GV jumped on board with a $22M A-1, and now this $80M.

While there are other companies out there exploring photonic computing and its potential uses, particularly in neural networks, Harris didn’t seem to feel anyone was nipping at Lightmatter’s heels. Few seem close to shipping a product, and in any case this is a market that’s in the middle of its hockey-stick moment. He pointed to an OpenAI study showing that demand for AI-related compute is growing far faster than existing technology can supply it, except by building ever larger data centers.

The next decade will bring economic and political pressure to rein in that power consumption, just as we’ve seen in the world of cryptocurrencies, and Lightmatter is poised to provide an efficient, powerful alternative to the usual GPU-based fare.

As Harris hinted hopefully earlier, what his company has made is potentially transformative for the industry, and if so there’s no need to rush – if there’s a gold rush coming, they’ve already staked their claim.
