Mike Elgan
Contributing Columnist

The rebirth of lifelogging and the death of Gordon Bell

Opinion
May 28, 2024 | 7 mins
Emerging Technology | Generative AI | Microsoft

The iPhone ended Gordon Bell’s lifelogging project. Now AI is bringing it back.

C. Gordon Bell
Credit: Queensland University of Technology

The tech world lost a legend earlier this month — and it happened the same week that his life-long vision was finally realized. 

I’m talking about C. Gordon Bell, the computer scientist who helped usher in the age of the personal computer. He designed the first commercially successful minicomputer in 1965 — the DEC PDP-8 — among countless other achievements in the field of computing. Bell died May 17 of pneumonia at his home in Coronado, CA. He was 89.

Bell’s lifelogging vision

Late in his career, Bell was inspired by Vannevar Bush’s hypothetical “Memex” system, which Bush described in a 1945 Atlantic Monthly article, “As We May Think.” From that inspiration, Bell became the world’s biggest advocate and practitioner of a concept called lifelogging.

Bell launched his lifelogging MyLifeBits project in 1998. The idea was to digitize and store all the content of one’s life and work. From the project page: He aimed to capture digital versions of “a lifetime’s worth of articles, books, cards, CDs, letters, memos, papers, photos, pictures, presentations, home movies, videotaped lectures, voice recordings, phone calls, IM transcripts, television, and radio.” (Bell famously wore two cameras around his neck, which snapped photographs at regular intervals.) Then, he would use custom-built software to retrieve any fact, any captured idea, any name, any event on demand.

MyLifeBits was part of Bell’s research at Microsoft. He joined Microsoft Research in 1995 and worked there until 2015, when he was named a researcher emeritus.

The death of lifelogging

Eight years ago, I interviewed Bell for Computerworld and, based on what he told me, I proclaimed in the headline: “Lifelogging is dead (for now).” What killed lifelogging, according to Bell, was the smartphone. He stopped his lifelogging experiment when the iPhone shipped in 2007.

Smartphones, he correctly predicted, would gather vastly more data than any previous device could, given their universality and ability to capture not only pictures and user data, but also sensor data. Suddenly, we had access to vastly more data, but no software capable of processing it into a cohesive and usable lifelogging system. 

He also correctly predicted that, in the future, lifelogging could return when we had better batteries, cheaper storage and — the pièce de résistance — artificial intelligence (AI) to help capture, organize and present the massive amounts of data. With AI, data doesn’t have to be tagged, filed specifically, or categorized. And the system can respond meaningfully to natural-language queries.

At the time, I wrote something I still believe: “I think we’ll find that everybody really does want to do lifelogging. They just don’t want more work, information overload or new data management problems. Once those problems are solved by better hardware and advanced AI, lifelogging and the photographic memory it promises will be just another background feature of every mobile device we use.” 

Don’t look now, but we’ve arrived at that moment. 

Suddenly: A new wave of lifelogging AI

Bell did his lifelogging research at Microsoft, so it’s especially poignant that within a few days of Bell’s death, Microsoft announced incredible lifelogging tools. (Company execs didn’t use the “L” word, but that’s exactly what they announced.)

During a special May 20 event preceding the Microsoft Build 2024 conference, the company introduced its Recall feature for Copilot+ PCs, which will run Windows 11 and sport Qualcomm’s new Snapdragon X Elite chips. (They have a neural processing unit (NPU) that makes Recall possible, according to Microsoft.)

Here’s how it works: Recall takes a screenshot of the user’s screen every few seconds. (Users can exempt chosen applications from being captured. Private browsing sessions aren’t captured, either. And specific screenshots, or all captures within a user-designated time frame, can be deleted.)

The screen-grabs are encrypted and stored locally, and the content can then be searched — or the user can scroll through it all chronologically. The secret sauce here, obviously, is that AI is processing all the data, identifying text, context, images and other information from the captures; it can later summarize, recall and generally use your screenshots to answer questions about what you’ve been doing, and with whom. The goal is to provide you with a digital photographic memory of everything that happens on your device.
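
To make the mechanics concrete, here is a minimal sketch of the general pattern a Recall-style tool follows: capture the screen every few seconds, extract the text, and index it locally for later search. This is not Microsoft's implementation; it's a conceptual Python illustration that assumes the Pillow and pytesseract libraries (plus a local Tesseract install) and leaves out the encryption and on-device AI analysis Microsoft describes.

```python
# Conceptual sketch only, not Microsoft's Recall implementation.
# Pattern: capture the screen periodically, OCR the image, store the text
# locally, and search it later. Assumes Pillow and pytesseract are installed
# along with a local Tesseract binary (all hypothetical setup choices).
import sqlite3
import time
from datetime import datetime

import pytesseract
from PIL import ImageGrab

DB_PATH = "lifelog.db"  # hypothetical local store; the real feature encrypts its data


def init_db() -> sqlite3.Connection:
    conn = sqlite3.connect(DB_PATH)
    # FTS5 gives simple full-text search over the extracted screen text.
    conn.execute(
        "CREATE VIRTUAL TABLE IF NOT EXISTS captures USING fts5(taken_at, screen_text)"
    )
    return conn


def capture_once(conn: sqlite3.Connection) -> None:
    image = ImageGrab.grab()                   # grab the current screen
    text = pytesseract.image_to_string(image)  # pull out any visible text
    conn.execute(
        "INSERT INTO captures (taken_at, screen_text) VALUES (?, ?)",
        (datetime.now().isoformat(timespec="seconds"), text),
    )
    conn.commit()


def search(conn: sqlite3.Connection, query: str):
    # Plain keyword search stands in for the semantic, AI-driven retrieval
    # Microsoft describes.
    return conn.execute(
        "SELECT taken_at, snippet(captures, 1, '[', ']', '...', 12) "
        "FROM captures WHERE captures MATCH ? ORDER BY taken_at DESC",
        (query,),
    ).fetchall()


if __name__ == "__main__":
    conn = init_db()
    for _ in range(3):      # capture a few frames, a few seconds apart
        capture_once(conn)
        time.sleep(5)
    for taken_at, excerpt in search(conn, "lifelogging"):
        print(taken_at, excerpt)
```

In the product Microsoft describes, on-device models do far more than OCR and the index stays encrypted on the machine; the sketch just shows why capture plus local indexing makes a screen timeline searchable at all.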

Microsoft’s Recall feature is lifelogging, pure and simple. AI makes this lifelogging tool feasible at scale for the first time ever.

(Copilot+ PCs start shipping on June 18, 2024, according to Microsoft.) 

One week before Microsoft’s announcement, and mere days before Bell’s passing, Google announced lifelogging tools of its own. During a video demonstration of Project Astra, where visual AI identifies and remembers objects in the room and performs other neat tricks via a Pixel phone, the woman showing off the technology picked up AI glasses and continued with her Astra session through the glasses. 

Astra is capturing video, which AI can process in real time or refer back to later. It feels like the AI tool is watching, thinking and remembering — which, of course, it isn’t. And it’s trivial for AI to spin out a text log of every single thing it sees, identifying objects and people along the way. AI could then retrieve, summarize, process and help you make instant sense of everything you saw. 

Bell wore cameras around his neck to capture snapshots. It couldn’t be more obvious that glasses capturing video for generative AI to process are a vastly superior tool for lifelogging.

Google this month also announced another powerful lifelogging tool, which I first told you about in September. It’s called NotebookLM. The AI-enhanced note-taking application beta is free to try if you’re in the United States. The idea is that you take all your notes in the application, and upload all content that comes your way, including text, pictures, audio files, Google Docs and PDFs. 

At any point, you can interrogate your own notebook with natural language queries, and the results will come back in a way that will be familiar if you’re a user of the major genAI chatbots. In fact, NotebookLM is built on top of Google’s PaLM 2 and Gemini Pro models. 

Like the better chatbots, NotebookLM will follow its display of results with suggested actions and follow-up questions. It will also organize your information for you. And you can invite others into specific notes and collaborate.
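
To give a feel for the underlying pattern, here is a toy sketch of "interrogate your own notes": index the uploaded content, retrieve the passages most relevant to a question, and hand them to a language model as grounding context. It is not how NotebookLM works internally; the TF-IDF retrieval (via scikit-learn) and the generate_answer() stub are stand-in assumptions for Google's own models and retrieval.

```python
# Toy illustration of the "ask questions of your own notes" pattern, not
# Google's NotebookLM implementation. Assumes scikit-learn is installed;
# generate_answer() is a stub where a real system would call an LLM.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-ins for the notes, documents and transcripts a user might upload.
notes = [
    "Gordon Bell launched the MyLifeBits project at Microsoft Research in 1998.",
    "Bell wore two cameras around his neck that snapped photos at intervals.",
    "He paused the lifelogging experiment after the iPhone shipped in 2007.",
]

vectorizer = TfidfVectorizer()
note_vectors = vectorizer.fit_transform(notes)  # index the uploaded content


def retrieve(question: str, top_k: int = 2) -> list:
    """Return the notes most similar to the question."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, note_vectors)[0]
    ranked = scores.argsort()[::-1][:top_k]
    return [notes[i] for i in ranked]


def generate_answer(question: str, context: list) -> str:
    # A real notebook assistant would pass the retrieved passages to a model
    # as grounding context; here we simply echo them back.
    return f"Q: {question}\nBased on your notes:\n- " + "\n- ".join(context)


if __name__ == "__main__":
    question = "When did Bell stop lifelogging?"
    print(generate_answer(question, retrieve(question)))
```

Swap the keyword retrieval for embeddings and the stub for a call to a generative model and you have the rough shape of what a notebook assistant does, minus Google's scale and polish.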

NotebookLM is the lifelogging system Gordon Bell spent nine years trying to build. But his ideas were too far ahead of the technology.

The past two weeks will go down in history as the most momentous for the lifelogging idea since Vannevar Bush described his Memex concept in 1945. Of course, in the AI era, lifelogging won’t be called lifelogging, and the ability to lifelog effectively will be seen as something of a banality — you know, like the PC and the many other digital gifts midwifed into existence by Gordon Bell.

I told you lifelogging was dead, until we got the AI tools. And now we have them.