
Pin Me, Please

By KIM BELLARD

You had to know I’d write about the new Humane AI Pin, right?

After all, I’d been pleading for the next big thing to take the place of the smartphone, as recently as last month and as long ago as six years, so when a start-up like Humane suggests it is going to do just that, it has my attention.  Even more intriguing, it is billed as an AI device, redefining “how we interact with AI.”  It’s like catnip for me.

For anyone who has missed the hype – and there has been a lot of hype, for several months now – Humane is a Silicon Valley start-up founded by two former Apple employees, Imran Chaudhri and Bethany Bongiorno (who are married).  They left Apple in 2016, had the idea for the AI Pin by 2018, and are ready to launch the actual device early next year.  It is intended to be worn as a pin on the lapel, starts at $699, and requires a monthly $24 subscription (which includes wireless connectivity).  Orders start November 16.

Partners include OpenAI, Microsoft, T-Mobile, Tidal, and Qualcomm.

Mr. Chaudhri told The New York Times that artificial intelligence  “can create an experience that allows the computer to essentially take a back seat.” He also told TechCrunch that the AI Pin represented “a new way of thinking, a new sense of opportunity,” and that it would “productize AI” (hmm, what are all those other people in AI doing?).  

Humane’s press release elaborates:

Ai Pin redefines how we interact with AI. Speak to it naturally, use the intuitive touchpad, hold up objects, use gestures, or interact via the pioneering Laser Ink Display projected onto your palm. The unique, screenless user interface is designed to blend into the background, while bringing the power of AI to you in multi-modal and seamless ways.

Basically, you wear a pin that is connected with an AI, which – upon request – will listen and respond to your requests. It can respond verbally, or it can project a laser display into the palm of your hand, which you can control with a variety of gestures that I am probably too old to learn but which younger people will no doubt pick up quickly.  It can take photos or videos, though the laser display apparently does not, at this point, do a great job of projecting them.

Here’s Humane’s introductory video:

Some cool features worth noting:

  • It can summarize your messages/emails;
  • It can make phone calls or send messages;
  • It can search the web for you to answer questions/find information;
  • It can act as a translator;
  • It has trust features that include not always listening and a “Trust Light” that indicates when it is.
Continue reading…

THCB Gang Episode 137, Thursday October 26

Joining Matthew Holt (@boltyboy) on #THCBGang on Thursday October 26 at 1pm PST 4pm EST were delivery & platform expert Vince Kuraitis (@VinceKuraitis); author & ponderer of odd juxtapositions Kim Bellard (@kimbbellard); futurist Ian Morrison (@seccurve); and our special guest was Kat McDavitt (@katmcdavitt), President of Innsena.

The video is below. If you’d rather listen to the episode, the audio is preserved from Friday as a weekly podcast available on our iTunes & Spotify channels.

Altman, Ive, and AI

BY KIM BELLARD

Earlier this year I urged that we Throw Away That Phone, arguing that the era of the smartphone should be over and that we should get on to the next big thing.  Now, I don’t have any reason to think that either Sam Altman, CEO of OpenAI, or Jony Ive, formerly and famously of Apple and now head of design firm LoveFrom, read my article, but apparently they have the same idea.

Last week The Information and then Financial Times reported that OpenAI and LoveFrom are “in advanced talks” to form a venture in order to build the “iPhone of artificial intelligence.”  SoftBank may fund the venture with as much as $1 billion.  There have been brainstorming sessions, and discussions are said to be “serious,” but a final deal may still be months away. The new venture would draw on talent from all three firms.

Details are scarce, as are comments from any of the three firms, but FT cites sources who suggest Mr. Altman sees “an opportunity to create a way of interacting with computers that is less reliant on screens,” a sentiment I heartily agree with.  The Verge similarly had three sources who agreed that the goal is a “more natural and intuitive user experience.”

Continue reading…

THCB Gang Episode 135, Thursday September 28

Joining Matthew Holt (@boltyboy) on #THCBGang on Thursday September 28 at 1pm PST 4pm EST are futurist Jeff Goldsmith; author & ponderer of odd juxtapositions Kim Bellard (@kimbbellard); and patient safety expert and all around wit Michael Millenson (@mlmillenson).

You can see the video below & if you’d rather listen than watch, the audio is preserved as a weekly podcast available on our iTunes & Spotify channels.

War — and Health Care — on the Cheap

By KIM BELLARD

Like many of you, I’m watching the war in Ukraine with great interest and much support. For all the fuss about expensive weapons — like F-16 fighters, Abrams tanks, Stryker and Bradley armored fighting vehicles, Patriot missile defense systems, Javelin anti-tank missiles, Himars long range missiles, and various types of high tech drones — what I’m most fascinated with is how Ukraine is using inexpensive, practically homemade drones as a key weapon.

It’s a new way of waging war. And when I say “waging war,” I can’t help but also think “providing health care.” It’s not so much that I think drones are going to revamp health care, but if very expensive weapons may, in fact, not be the future of warfare, maybe very expensive treatments aren’t necessarily the future of healthcare either.

Just within the last two weeks, for example, The New York Times headlined Budget Drones Prove Their Value in a Billion-Dollar War, AP said Using duct tape and bombs, Ukraine’s drone pilots wage war with low-cost, improvised weapons, ABC News reports: Inside Ukraine’s efforts to bring an ‘army of drones’ to war against Russia, and Defense News describes how Cardboard drone vendor retools software based on Ukraine war hacks.

This is not the U.S. military-industrial complex’s “shock-and-awe” kind of warfare; this is the guy-in-his-garage-building-his-own-weapons kind of warfare.

Ukraine’s minister for digital transformation, Mykhailo Fedorov, says the government is committed to building a state-of-the-art “army of drones.” He promises: “A new stage of the war will soon begin.”

NYT detailed:

Drones made of plastic foam or plastic are harder to find on radar, reconnaissance teams said. Ukraine buys them from commercial suppliers who also sell to aerial photographers or hobbyists around the world, along with parts such as radios, cameras, antennas and motors. The drone units mix and match parts until they find combinations that can fly past sophisticated Russian air defenses.

“The doctrine of war is changing,” one Ukrainian commander said. “Drones that cost hundreds of dollars are destroying machines costing millions of dollars.” The AP discusses how an elite drone unit – “a ragtag group of engineers, corporate managers and filmmakers” — “assembled with just $700,000, has destroyed $80 million worth of enemy equipment.”

Continue reading…

DNA is Better at Math than You Are

By KIM BELLARD

I was tempted to write about the work being done at Wharton that suggests that AI may already be better at being entrepreneurial than most of us, and of course I’m always interested to see how nanoparticles are starting to change health care (e.g., breast cancer or cancer more generally), but when I saw what researchers at China’s Shanghai Jiao Tong University have done with DNA-based computers, well, I couldn’t pass that up. 

If PCs helped change the image of computers from the big mainframes, and mobile phones further redefined what a computer is, then DNA computers may one day – in the lifetime of some of you – make us look back on our chip-based devices as being as primitive as we now view ENIAC.

It’s been almost 30 years since Leonard Adleman first suggested the idea of DNA computing, and there’s been a lot of excitement in the field since, but, really, not the kind of progress that would make a general purpose DNA computer seem feasible. That may have changed.

At the risk of introducing way too many acronyms, the Chinese researchers claim they have developed a general purpose DNA integrated circuit (DIC), using “multilayer DNA-based programmable gate arrays (DPGAs).” The DPGAs are the building blocks of the DIC and can be mixed and matched to create the desired circuits. They claim that each DPGA “can be programmed with wiring instructions to implement over 100 billion distinct circuits.”

They keep track of what is going on using fluorescence markers, which probably makes a computation fun to watch.

One experiment, involving 3 DPGAs and 500 DNA strands, made a circuit that could solve quadratic equations, and another could do square roots. Oh, and, by the way, another DPGA circuit could identify RNA molecules that are related to renal cancer. They believe their DPGAs offer the potential for “intelligent diagnostics of different kinds of diseases.”
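To make the "mix and match" idea concrete, here is a loose software analogy (not molecular chemistry): treat each DPGA as a reusable block that computes one primitive step, then "wire" blocks together into a larger circuit, the way the researchers composed DPGAs into their quadratic-equation solver. The block names and wiring below are invented for illustration.

```python
# Software analogy for composing DPGA building blocks into a circuit.
# Each "block" computes one primitive operation; the circuit wires them
# together to solve ax^2 + bx + c = 0 (real roots only), mirroring the
# quadratic-solver demo described above.

from math import sqrt

# primitive "gate array" blocks
mul = lambda x, y: x * y
sub = lambda x, y: x - y
root = lambda x: sqrt(x)

def quadratic_circuit(a, b, c):
    """Wire primitive blocks into a quadratic solver (assumes real roots)."""
    disc = sub(mul(b, b), mul(4 * a, c))   # discriminant: b^2 - 4ac
    r = root(disc)
    return (sub(-b, -r) / (2 * a), sub(-b, r) / (2 * a))

print(quadratic_circuit(1, -3, 2))  # roots of x^2 - 3x + 2: (2.0, 1.0)
```

The point of the analogy is composability: the same small blocks can be rewired into a square-root circuit, or any other function, which is what makes the DIC "general purpose."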

DNA tracking DNA.

Continue reading…

Poor Kids. Pitiful Us

By KIM BELLARD

Well, congratulations, America.  The child poverty rate more than doubled from 2021 to 2022, jumping from 5.2% to 12.4%, according to new figures from the Census Bureau.  Once again, we prove we sure have a funny way of showing that we love our kids.

The poverty rate is actually the Supplemental Poverty Measure (SPM), which takes into account government programs aimed at low income families but which are not counted in the official poverty rate. The official poverty rate stayed the same, at 11.5%, while the overall SPM increased 4.6 percentage points (to 12.4%), the first time the SPM has increased since 2010.  It’s bad enough that over 10% of our population lives in poverty, but that so many children live in poverty, and that their rate doubled from 2021 to 2022 — well, how does one think about that?

The increase was expected. In fact, the outlier number was the “low” 2021 rate.  Poverty dropped due to COVID relief programs, in particular the child tax credit (CTC).  It had the remarkable (and intended) impact of lowering child poverty, but was allowed to expire at the end of 2021, which accounts for the large increase. We’re basically back to where we were pre-pandemic.

President Biden was quick to call out Congressional Republicans (although he might have chided Senator Joe Manchin just as well):

Today’s Census report shows the dire consequences of congressional Republicans’ refusal to extend the enhanced Child Tax Credit, even as they advance costly corporate tax cuts…The rise reported today in child poverty is no accident—it is the result of a deliberate policy choice congressional Republicans made to block help for families with children while advancing massive tax cuts for the wealthiest and largest corporations.

Many experts agree: child poverty, and poverty more generally, is a choice, a policy choice.

Continue reading…

The Times They Are A-Changing….Fast

By KIM BELLARD

If you have been following my Twitter – oops, I mean “X” – feed lately, you may have noticed that I’ve been emphasizing The Coming Wave, the new book from Mustafa Suleyman (with Michael Bhaskar). If you have not yet read it, or at least ordered it, I urge you to do so, because, frankly, our lives are not going to be the same, at all.  And we’re woefully unprepared.

One thing I especially appreciated is that, although he made his reputation in artificial intelligence, Mr. Suleyman doesn’t only focus on AI. He also discusses synthetic biology, quantum computing, robotics, and new energy technologies as ones that stand to radically change our lives.  What they have in common is that they have hugely asymmetric impacts, they display hyper-evolution, they are often omni-use, and they increasingly demonstrate autonomy. 

In other words, these technologies can do things we didn’t know they could do, have impacts we didn’t expect (and may not want), and may decide what to do on their own.  

For the near future, building an AI requires a significant amount of computing power, specialized chips, and a large amount of data, but with synthetic biology, the technology is getting to the point where someone can set up a lab in their garage and experiment away.  AI can spread rapidly, but it needs a connected device; engineered organisms can get anywhere there is air or water.

“A pandemic virus synthesized anywhere will spread everywhere,” MIT’s Kevin Esvelt told Axios.

I’ve been fascinated with synthetic biology for some time now, and yet I still think we’re not paying enough attention. “For me, the most exciting thing about synthetic biology is finding or seeing unique ways that living organisms can solve a problem,” David Riglar, Sir Henry Dale research fellow at Imperial College London, told The Scientist. “This offers us opportunities to do things that would otherwise be impossible with non-living alternatives.”

Jim Collins, Termeer professor of medical engineering and science at Massachusetts Institute of Technology (MIT), added: “By approaching biology as an engineering discipline, we are now beginning to create programmable medicines and diagnostic tools with the ability to sense and dynamically respond to information in our bodies.”

For example, researchers just reported on a smart pill — the size of a blueberry! — that can be used to automatically detect key biological molecules in the gut that suggest problems, and wirelessly transmit the information in real time. 

Continue reading…

Smells like AI Spirit

By KIM BELLARD

There are so many exciting developments in artificial intelligence (AI) these days that one almost becomes numb to them. Then along comes something that makes me think, hmm, I didn’t see that coming.

For example, AI can now smell.

Strictly speaking, that’s not quite true, at least not in the way humans and other creatures smell.  There’s no olfactory organ, like our nose or a snake’s tongue. What AI has been trained to do is to look at a molecular structure and predict what it would smell like.

If you’re wondering (as I certainly did when I heard AI could smell), AI has also started to crack taste as well, with food and beverage companies already using AI to help develop new flavors, among other things. AI can even reportedly “taste wine” with 95% accuracy. It seems human senses really aren’t as human-only as we’d thought.

The new research comes from the Monell Chemical Senses Center and Osmo, a Google spin-off. It’s a logical pairing since Monell’s mission is “to improve health and well-being by advancing the scientific understanding of taste, smell, and related senses,” and Osmo seeks to give “computers a sense of smell.” More importantly, Osmo’s goal in doing that is: “Digitizing smell to give everyone a shot at a better life.”

Osmo CEO Alex Wiltschko, PhD says: “Computers have been able to digitize vision and hearing, but not smell – our deepest and oldest sense.” It’s easy to understand how vision and hearing can be translated into electrical and, ultimately, digital signals; we’ve been doing that for some time. Smell (and taste) seem somehow different; they seem chemical, not electrical, much less digital. But the Osmo team believes: “In this new era, computers will generate smells like we generate images and sounds today.”

I’m not sure I can yet imagine what that would be like.

The research team used an industry dataset of 5,000 known odorants, and matched molecular structures to perceived scents, creating what Osmo calls the Principal Odor Map (POM). This model was then used to train the AI. Once trained, the AI outperformed humans in identifying new odors.
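The core idea is simpler than it sounds: represent each molecule as a feature vector, place known odorants in a shared space, and predict an unseen molecule's scent from its neighbors in that space. A loose, pure-Python sketch of that idea follows; the molecules, feature numbers, and labels are invented for illustration, and the real model is a neural network trained on the ~5,000-odorant dataset, not a nearest-neighbor lookup.

```python
# Toy illustration of structure-to-scent prediction: embed known odorants
# as "molecular descriptor" vectors, then predict a new molecule's odor
# labels from its nearest known neighbor. All values here are made up.

from math import dist

# hypothetical descriptor vectors (e.g., size, ester-ness, ring count)
KNOWN_ODORANTS = {
    "isoamyl acetate": ([0.4, 0.9, 0.0], {"fruity", "banana"}),
    "vanillin":        ([0.5, 0.2, 1.0], {"sweet", "vanilla"}),
    "limonene":        ([0.6, 0.1, 1.0], {"citrus", "fresh"}),
}

def predict_odor(features, k=1):
    """Return the odor labels of the k nearest known molecules."""
    ranked = sorted(KNOWN_ODORANTS.values(),
                    key=lambda entry: dist(entry[0], features))
    labels = set()
    for _feat, odor in ranked[:k]:
        labels |= odor
    return labels

# a hypothetical new molecule whose descriptors resemble isoamyl acetate
print(predict_odor([0.45, 0.85, 0.1]))  # predicts {"fruity", "banana"}
```

The interesting part of the actual research is exactly what this sketch glosses over: choosing a representation in which "close in the map" reliably means "smells similar."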

The model depends on the correlation between the molecules and the smells perceived by the study’s panelists, who were trained to recognize 55 odors. “Our confidence in this model can only be as good as our confidence in the data we used to test it,” said co-first author Emily Mayhew, PhD. Senior co-author Joel Mainland, PhD, admitted: “The tricky thing about talking about how the model is doing is we have no objective truth.”

The study resulted in a different way to think about smell. The Monell Center says:

The team surmises that the model map may be organized based on metabolism, which would be a fundamental shift in how scientists think about odors. In other words, odors that are close to each other on the map, or perceptually similar, are also more likely to be metabolically related. Sensory scientists currently organize molecules the way a chemist would, for example, asking does it have an ester or an aromatic ring?

“Our brains don’t organize odors in this way,” said Dr. Mainland. “Instead, this map suggests that our brains may organize odors according to the nutrients from which they derive.”

“This paper is a milestone in predicting scent from chemical structure of odorants,” Michael Schmuker, a professor of neural computation at the University of Hertfordshire who was not involved in the study, told IEEE Spectrum.  It might, he says, lead to possibilities like sharing smells over the Internet. 

Think about that. 

“We hope this map will be useful to researchers in chemistry, olfactory neuroscience, and psychophysics as a new tool for investigating the nature of olfactory sensation,” said Dr. Mainland. He further noted: “The most surprising result, however, is that the model succeeded at olfactory tasks it was not trained to do. The eye-opener was that we never trained it to learn odor strength, but it could nonetheless make accurate predictions.”

Next up on the team’s agenda is to see if the AI can learn to recognize mixtures of odors, which exponentially increases the number of resulting smells. Osmo also wants to see if AI can predict smells from chemical sensor readings, rather than from molecular structures that have already been digitized. And, “can we digitize a scent in one place and time, and then faithfully replicate it in another?”

That’s a very ambitious agenda.

Dr. Wiltschko claims: “Our model performs over 3x better than the standard scent ingredient discovery process used by major fragrance houses, and is fully automated.” One can imagine how this would be useful to those houses. Osmo wants to work with the fragrance industry to create safer products: “If we can make the fragrances we use every day safer and more potent (so we use less of them), we’ll help the health of everyone, and also the environment.”

When I first read about the study, I immediately thought of how dogs can detect cancers by smell, and how exciting it might be if AI could improve on that. Frankly, I’m not much interested in designing better fragrances; if we’re going to spend money on training AI to recognize molecules, I’d rather it be spent on designing new drugs than new fragrances.

Fortunately, Osmo has much the same idea. Dr. Wiltschko writes:

If we can build on our insights to develop systems capable of replicating what our nose, or what a dog’s nose can do (smell diseases!), we can spot disease early, prevent food waste, capture powerful memories, and more. If computers could do these kinds of things, people would live longer lives – full stop. Digitizing scent could catalyze the transformation of scent from something people see as ephemeral to enduring.   

Now, that’s the kind of innovation that I’m hoping for.

Skeptics will say, well, AI isn’t really smelling anything, it’s just acting as though it does: there’s no perception, just prediction. One could make the same argument about AI taste, or vision, or hearing, not to mention thinking itself. But at some point, as the saying goes, if it looks like a duck, swims like a duck, and quacks like a duck, it’s probably a duck.  At some point in the not-so-distant future, AI is going to have senses similar to and perhaps much better than our own.

As Dr. Wiltschko hopes: “If computers could do these kinds of things, people would live longer lives – full stop.”

Kim is a former emarketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and now regular THCB contributor.