
AI’s Cool New Thing: Capsule Networks (explained) 

A buzz is building in AI circles around “capsule networks,” a new variant on neural networks that backers say could simplify, cut the cost of, commoditize and, ultimately, democratize how deep learning systems are trained to do what we want them to do.

How can capsule networks do all this? They hold out the hope of tackling one of the biggest problems in AI: radically reducing the amount of data (and compute) needed to train deep learning systems. This in turn means AI could become available to the broader market, no longer consigned to a few companies with mammoth compute resources and infinite volumes of data – i.e., the FANG* companies.

In fact, a FANG company, Google, is the birthplace of capsule networks. Google researchers Sara Sabour, Nicholas Frosst and Geoffrey Hinton (the last is the team lead and a Google engineering fellow at the company’s Brain Team in Toronto) published a paper on the topic, “Dynamic Routing Between Capsules,” last month. Having read, or tried to read, the abstract, we decided it might be best to ask someone to explain what it all means, and for that we talked with Mike Fitzmaurice, VP of workflow technology at Nintex, a Bellevue, WA, company that uses machine/deep learning to improve clients’ business processes. As Fitzmaurice put it, “we are a consumer of AI, not a producer of it.”

Geoffrey Hinton of Google

Fitzmaurice said capsule networks’ core idea is to break up the neural net into chunks, or capsules, that work in teams and are assigned a portion of a problem; each capsule is pre-loaded with basic training on its portion of the object or process it will examine. The individual capsules work cooperatively, sharing their findings and contributing to solving the problem as a whole.

This eases the task facing traditional neural networks, which must figure out entire problems, not just a piece, and which start from scratch, without advance knowledge, tabula rasa. That means “brute force feeding of large amounts of raw data,” Fitzmaurice said, which is time-, compute-, data- and cost-intensive.

Capsule networks will (it’s hoped) do an end-around on this problem by pre-loading a template of the target object or process the capsule network is programmed to find (a face, for example) or evaluate (such as a business workflow). A capsule network assigned to do facial recognition, for example, is trained in advance on the basic structure of the face – avoiding the training steps a conventional neural net needs just to learn what a face looks like. One capsule may be trained to recognize a mouth, another a nose, and so forth; the capsules then come together to determine whether a given face belongs to a given individual.
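To make that part-to-whole idea a little more concrete, here is a minimal sketch of the “routing-by-agreement” step described in the Sabour/Frosst/Hinton paper, written in plain NumPy. The names (`squash`, `route`, `votes`) and the toy mouth/nose/eye setup are our own illustration, not code from Google or Nintex, and the shapes are chosen purely for readability.

```python
# A minimal sketch of capsule "routing by agreement" (per the Sabour/Frosst/Hinton
# paper). All names and dimensions here are illustrative assumptions.
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Capsule nonlinearity: shrinks short vectors toward 0, long ones toward unit length."""
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    norm = np.sqrt(norm_sq + eps)
    return (norm_sq / (1.0 + norm_sq)) * (s / norm)

def route(votes, n_iters=3):
    """votes: (n_parts, n_wholes, dim) -- each part capsule's prediction for each whole capsule."""
    n_parts, n_wholes, _ = votes.shape
    logits = np.zeros((n_parts, n_wholes))            # routing logits, start uniform
    for _ in range(n_iters):
        c = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # coupling coefficients
        s = (c[..., None] * votes).sum(axis=0)        # weighted sum of votes per whole capsule
        v = squash(s)                                 # output vector of each whole capsule
        logits += np.einsum('pwd,wd->pw', votes, v)   # reward parts that agree with the consensus
    return v

# Toy example: 3 "part" capsules (say, mouth, nose, eye detectors) voting for
# 2 candidate "whole" capsules (face vs. not-face), each with an 8-D pose vector.
votes = np.random.randn(3, 2, 8)
whole_outputs = route(votes)
print(np.linalg.norm(whole_outputs, axis=-1))         # vector length ~ confidence the whole is present
```

The length of each output vector acts as the capsules’ collective confidence that the whole – the face, in Fitzmaurice’s example – is actually there, which is the “sharing their findings” behavior he describes.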

The advantages of capsule networking could have far reaching implications. “Capsule networks will make AI usable and ubiquitous,” Fitzmaurice told EnterpriseTech.

He speculated that Google may be ready to release capsule networks to the open market next year via Google Cloud. “They’re holding their cards close to the vest,” he said, “but I doubt they’d have one of their senior researchers talking about it if they weren’t getting ready to do something soon.” He also agreed that capsule networks have probably been used, and proven, internally at Google before a commercial release. “That’s exactly the way Google has operated in the past,” he said.

A key difference between capsule networks and conventional neural networks is the kinds of workloads each is suited to handle. Capsule networks are right for problems “less open-ended and less discovery-oriented, that are more utility oriented,” Fitzmaurice said. “This won’t spell the death of neural networks. Neural networks will do things capsule networks won’t, like find patterns that we weren’t expecting. But we don’t need that (capability) for things where the patterns are well understood, where we know the patterns already.”

It means capsule networks will be effective “as long as we’re willing to make some assumptions at the beginning,” Fitzmaurice said. “And for a whole lot of business problems those assumptions are perfectly safe. For other (types of problems) I still want a neural network; I want it, in effect, to ask questions, to look at all possible data and find patterns it never occurred to me to look for. But not every business process requires that.”

“If I’m willing to give it (a capsule network) some up-front data, then I don’t need to give it as much raw data,” he said, “so I as a provider could deliver democratized services to lots of people.”

He said capsule networks, trained in advance on how a thing or a process should look, could lead to advances in medical diagnostics.

Mike Fitzmaurice of Nintex

“Practically speaking, I wouldn’t trust a neural network to look at x-rays because we just don’t have enough similar but different x-rays to … look for an anomaly in one compared to a bunch of others,” Fitzmaurice said. “But capsule networking might actually help here, because it’s already been taught what a normal x-ray is supposed to look like, and then it can look for anomalies within the right amount of variance.”
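A full capsule network isn’t needed to see the intuition behind “anomalies within the right amount of variance.” The hypothetical sketch below – a rough statistical stand-in, not Google’s method or anything from Nintex – learns a per-pixel picture of “normal” from known-good scans and flags pixels that drift more than an allowed number of standard deviations from it.

```python
# A toy deviation-from-template anomaly check, standing in for the idea of
# comparing a new scan against a learned notion of "normal." Names, shapes
# and thresholds are illustrative assumptions only.
import numpy as np

def fit_normal_template(normal_images):
    """Learn per-pixel mean and spread from scans known to be normal."""
    stack = np.stack(normal_images)                  # (n_scans, H, W)
    return stack.mean(axis=0), stack.std(axis=0) + 1e-6

def anomaly_map(image, mean, std, z_threshold=3.0):
    """Flag pixels deviating from the normal template by more than z_threshold sigmas."""
    z = np.abs(image - mean) / std
    return z > z_threshold                           # boolean mask of suspicious regions

# Usage with synthetic data standing in for pre-processed, aligned x-rays.
rng = np.random.default_rng(0)
normals = [rng.normal(0.5, 0.05, (64, 64)) for _ in range(20)]
mean, std = fit_normal_template(normals)
scan = rng.normal(0.5, 0.05, (64, 64))
scan[20:28, 20:28] += 0.5                            # inject a bright anomaly
print(anomaly_map(scan, mean, std).sum(), "pixels flagged")
```

The point is simply that once a system already knows what “normal” looks like, it needs far fewer examples to decide whether a new case falls inside or outside the tolerated variance.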

He foresees capsule network capabilities being offered via public cloud providers, making deep learning and AI significantly more accessible and affordable.

“I can tell you we’re reaching a tipping point this year, certainly the beginning of next year,” he said. “With AI, it used to be if you didn’t have the economies of scale in terms of data and computing power you couldn’t do AI. AI is now getting to the point of becoming ‘intelligence-as-a-service.’”

Lower-cost AI not only would have broad market appeal, he said, it could also lead to more sharing of data – securely, of course.

“If enough cloud providers deliver intelligence-as-a-service, enough customers could feed it enough data that it could get pretty smart pretty quickly,” he said. “As long as there are protections for my data from your data, when it comes out I don’t mind if they’re mixed together to teach it to be smarter.”

Incidentally, put Fitzmaurice down among those who aren’t fearful of the technological “singularity,” the idea that artificial super-intelligence will trigger runaway technological growth that changes human civilization. For the foreseeable future, he sees AI being assigned primarily to discrete (and mundane) business tasks.

“AI is really useful,” he said, “but we think of it as replacing humans, and AI replaces humans no more than business process automation replaces humans, or word processing replaced humans, or spreadsheets replaced humans. We still need humans to do lots of things. But what AI is really good at is repeated, annoying, tedious things that humans get tired or bored with. Finding patterns in a very large data set, most humans I know would be asleep in very short order.”

*FANG: Facebook, Amazon, Netflix, Google
