In 1928, the mathematician David Hilbert posed a challenge he called the Entscheidungsproblem (which translates to “decision problem”).

Roughly speaking, the problem asks whether there exists an effective procedure (what we would today call an “algorithm”) that can take as input a set of axioms and a mathematical statement, and then decide whether or not the statement can be proved using those axioms and standard logic rules.

Hilbert thought such a procedure probably existed.

In this paper, Turing proved that there exist problems that cannot be solved systematically (i.e., with an algorithm). He then argued that if you could solve Hilbert’s decision problem, you could use that powerful proof machine to solve one of these unsolvable problems: a contradiction!

Though Turing was working before computers, his framework and results formed the foundation of theoretical computer science (my field), as they can be used to explore what can and cannot be solved by computers.

Over time, theoreticians enumerated many problems that cannot be solved using a fixed series of steps. These came to be known as undecidable problems, while those that can be solved mechanistically were called decidable.
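The canonical example of an undecidable problem is the halting problem: no algorithm can decide, for every program and input, whether that program eventually halts. Turing's contradiction argument can be sketched in a few lines of modern code. This sketch is my own illustration, not anything from the post; `halts` and `trouble` are hypothetical names, and `halts` here is a deliberately naive stub standing in for any claimed oracle.

```python
def halts(prog, arg):
    """A claimed halting oracle: True if prog(arg) eventually halts.
    This stub naively answers True; Turing's argument shows that NO
    correct implementation can exist, whatever code goes here."""
    return True

def trouble(p):
    """Do the opposite of whatever halts predicts about p run on itself."""
    if halts(p, p):
        # A real version would loop forever here; we raise an exception
        # instead so the sketch stays runnable.
        raise RuntimeError("looping forever (conceptually)")
    # ...otherwise, halt immediately.

# The oracle claims trouble(trouble) halts, yet by construction trouble
# then refuses to halt, so the claim is wrong. Predicting False fails
# symmetrically: trouble would halt at once. Every candidate oracle is
# defeated by this one self-referential input.
```

Whatever implementation you substitute for `halts`, feeding `trouble` to itself forces the oracle to be wrong about that very input, which is the diagonalization at the heart of Turing's proof.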

The history of theoretical computer science is interesting in its own right, especially given Hollywood’s recent interest in Turing.

But in this post, I want to draw a less expected connection: Turing’s conception of decidable and undecidable problems turns out to provide a useful metaphor for understanding how to increase your value in the knowledge work economy…

In 2013, during a period of only three months, Stephen King published two full-length novels: Joyland and Doctor Sleep. This is unusually productive, even for a writer who has published fifty-five novels in his career (and sold over 350 million copies along the way).

Nestled halfway through this list was a piece of advice that caught my attention:

“The first draft of a book—even a long one—should take no more than three months, the length of a season.”

This tip resonates with my experience well beyond just book writing. Things worth doing take time, but if they take too much time your intensity might begin to wane to unproductive levels.

I previously admitted that I don’t web surf and that I’ve never had a social media account. This next admission might be the final straw that leads to the permanent revocation of my internet privileges: I’m bad at answering e-mails.

I sometimes go a whole day without looking at my inbox (and sometimes even longer). I ignore messages. People I know well tend to call me when they really need to know something.

I’m not bad at e-mails on purpose. If anything, I’m apologetic and ashamed about it and try to be more responsive when I can. But I only have so many hours to work each day, and I tend to block off as much of that time as I can get away with for deep efforts.

This philosophy is a boon to my role as a theoretician trying to solve interesting problems, but a bane to my role as a member of a bureaucracy.

For the past two weeks, I’ve been trying to prove a bothersome theorem. It’s not particularly flashy, but I need it for a paper. More importantly, it felt like it should be easy, and I took it personally that it wasn’t.

Predictably, I began to obsess about this proof — by which I mean I took to returning to the proof again and again during breaks in my working day. It became a staple during my commutes to and from work, and began to hijack blocks of time from my otherwise carefully constructed schedules.

Earlier this week, the weather was nice, so while waiting out the traffic at home in the morning I sat outside in my backyard with my grid notebook (something about grid rule aids mathematical thinking) and, as I had been doing, noodled on the theorem.

Except this time: something shook loose.

I scribbled notes for an hour, drove to campus, and set about trying to formalize my new idea.

It didn’t work.

But now I had the scent. Long story short, six hours later I had a proof that seems to work for a more or less reasonable version of the problem (time will tell).

I started that day with a pretty elaborate time block schedule. It was ignored; as was my e-mail inbox; as were several pretty important administrative obligations. But the important thing is that I think I finally tamed that damnable theorem.

In the article, Buffett wanted to help his employee get ahead in his working life, so he suggested that the employee list the twenty-five most important things he wanted to accomplish in the next few years. He then had the employee circle the top five and told him to prioritize this smaller list.

All seemed well until the wise billionaire asked one more question: “What are you going to do with the other twenty things?”

The employee answered: “Well the top five are my primary focus but the other twenty come in at a close second. They are still important so I’ll work on those intermittently as I see fit as I’m getting through my top five. They are not as urgent but I still plan to give them dedicated effort.”

Buffett surprised him with his response: “No. You’ve got it wrong…Everything you didn’t circle just became your ‘avoid at all cost list.'”

A graduate student recently sent me a note asking how I keep track of potential projects in my academic work. This got me thinking, and after some consideration I decided I had two answers.

The first answer is literal…

Since September 2004, I’ve always kept an idea notebook with me to capture spontaneous thoughts relevant to many different areas in my life, including potential professional projects. (The picture above is a sampling of my large collection of full idea books.) I try to review my current notebook every couple of months.

Eric Betzig is a research leader at the Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Virginia. Last month he received a surprising and life-changing call: he had been awarded the Nobel Prize in Chemistry for his work on high-resolution microscopy (see the video above).

Everyone who wins a Nobel is impressive, but what makes Betzig particularly worthy of attention is his unlikely path to the prize. One thing that any impartial observer will confirm is that if you had met Betzig in 1994, the idea that he would one day win the most prestigious award in science would have seemed absurd.
