Public Health & Medicine

Innovation at the Office
Before Today's Post, an Announcement: The Institute for Progress is hosting a free 6-week online PhD course titled "The economics of ideas, science and innovation." I'm teaching one of the sessions, and Pierre Azoulay, Ina Ganguli, Benjamin Jones, and Heidi Williams are teaching the rest. An all-star lineup! The course is aimed at economics PhD students who want to learn more about the economics of innovation, but we're also open to applications from PhD students in related fields or recent graduates. The course starts November 1, but the deadline to apply is September 6. Learn more here!

Now for your regularly scheduled content… Like the rest of New Things Under the Sun, this article will be updated as the state of the academic literature evolves; you can read the latest version here. You can listen to this post via most podcast apps: Apple, Spotify, Google, Amazon, Stitcher.

For decades, the office was the default way to organize workers, but that default is being re-examined. Many workers (including me) prefer working remotely, and seem to be at least as productive working remotely as they are in the office. Remote-capable organizations can hire from a bigger pool of workers than is available locally. All in all, remote work seems to have been underrated, relative to just a few years ago.

But there are tradeoffs. I've written before that physical proximity seems to be important for building new relationships, even though those relationships seem to remain productive as people move away from each other. This post narrows the focus down to the office. Does bringing people together in the office actually facilitate meeting new people? (Spoiler: yes.) But I'll also try to get more specific about how, when, and why this happens.

One aside: this is a rich literature that goes back decades. I'm going to focus on relatively recent research that looks at scientists and startups and uses experimental and quasi-experimental approaches. But a lot of this recent work turns out to echo what earlier studies found using more observational approaches. Allen and Henn (2007) provide one overview of some of the older literature.

Academic Collaboration Among Neighbors

Let's start with buildings. Are people more likely to work together on a project if they also work in the same building? Miranda and Claudel (2021) look at what happens to collaboration between MIT-affiliated professors and staff when they start working in the same building (or get separated), due to a series of renovation and new building projects over 2005-2015. Every year they look at each pair of 1,417 MIT authors to see if the authors' offices are in the same building, and if they were coauthors on a paper.

They want to estimate the impact of being in a building together, which presents a bit of a challenge. We might expect people to seek out offices in the same building as their expected collaborators, but they would have ended up working together whether they succeeded in getting colocated offices or not. That could overstate the impact of being in the same building. So Miranda and Claudel try to estimate the impact of being in the same building, after you adjust for a particular pair of authors' underlying propensity to collaborate regardless of location. Essentially, pick a random pair of MIT coauthors and identify two years where they had the same number of publications in the previous year.
If they were in the same building in one of these comparison years and not in the same building in the other, they tended to publish an extra 0.004 papers together in the year they were in the same building. An extra 0.004 papers might not seem like much, but that's because most random pairs of MIT scientists do not put out any papers together in a given year. With 1,417 MIT authors, there are over a million possible ways for them to pair off, but they collectively put out only 38,000 papers written by multiple MIT authors over the decade. That works out to about 0.004 papers per pair per year, which implies moving people into the same building about doubles the number of papers they might be expected to put out together.

That's about the same order of magnitude found by Catalini (2017). Catalini focuses on the Université Pierre-et-Marie-Curie and its 17-year quest to remove asbestos from its buildings. Asbestos removal required moving labs to new locations, typically based on what space was available rather than as a way to make inter-lab collaboration easier. Catalini also finds that when labs are moved into the same building, they put out 2.5-3.3x as many joint publications as pairs of labs that are not moved together.

Going Inside the Building

That's for two people (or groups) working in the same building. But buildings can be pretty big. What if we look within the building; do we see similar effects for people with offices that are closer or farther away from each other? Roche, Oettl, and Catalini (2022) peer inside a US co-working space that hosted 251 different startups over 2014-2017. Whereas Miranda and Claudel (2021) and Catalini (2017) needed to try to convince us that building moves were basically random due to renovation, in this case the startup residents actually were randomly allocated to different places in the co-working hub. Very convenient for the researchers!

A difficulty, though, is that startups do not typically collaborate on easily observable projects like scientific papers. Instead, Roche, Oettl, and Catalini look for evidence that the startups trade information, using data from BuiltWith that describes which web technologies startups use. For example, NewThingsUnderTheSun.com is in the BuiltWith dataset, and it shows I use CloudFlare for a bunch of stuff, and that I registered the domain name from Tucows. Suppose I moved into a coworking space with a bunch of startups that used a web technology called Mixpanel for A/B testing. Roche and coauthors can see this in their dataset. If I started using Mixpanel myself to do A/B testing for NewThingsUnderTheSun.com after moving into the coworking space, that suggests I learned about Mixpanel from some of the other startups there.

Roche, Oettl, and Catalini measure the shortest walking distance between each pair of startups on the same floor (walking distance is the shortest path you could actually walk, respecting walls, furniture, etc.) and then they look at the probability that startups adopt each other's component web technologies. As you might expect, the closer two startups' workspaces are, the more likely they are to use each other's stuff. What's perhaps a bit surprising, though, is that the effect of distance is highly nonlinear. Divide the startup pairs into four groups, based on their proximity, and you find only the 25% that are closest exhibit any knowledge sharing. It looks like being in the same building only matters if you are actually really close - like, within 66 meters!
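As a quick aside, the claim above that an extra 0.004 papers in the Miranda and Claudel comparison amounts to roughly a doubling follows from back-of-envelope arithmetic on the quoted figures. Here is a minimal sketch using only the numbers in the text (the paper's own estimation is more involved than this):

```python
# Back-of-envelope check of the baseline collaboration rate among MIT author pairs.
# Figures are the ones quoted in the text; the paper's sample construction may differ.
from math import comb

authors = 1_417          # MIT authors tracked
joint_papers = 38_000    # papers with multiple MIT authors over the decade
years = 10               # 2005-2015, roughly a decade of data

pairs = comb(authors, 2)                   # ~1.0 million possible author pairs
baseline = joint_papers / (pairs * years)  # joint papers per pair per year

print(f"{pairs:,} pairs, baseline of {baseline:.4f} joint papers per pair per year")
# ~0.004, so an extra 0.004 papers from sharing a building is roughly a doubling.
```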
[Figure: Additional probability of adopting another startup's web technology, dividing distance into 4 bins. From Roche, Oettl, and Catalini (2022)]

This echoes a common finding in some of the older literature I alluded to earlier. Proximity matters, but for most people the value of proximity falls off very fast. If you have to walk very far to talk with a colocated coworker, then that coworker might as well not be colocated.

Hasan and Koning (2019) get similar results in the context of a startup bootcamp in India. They randomly assign 112 aspiring entrepreneurs to 40 different teams, whose location in a large open co-working space is also randomly assigned. Bootcamp attendees spent their first week developing a project that was later evaluated by the team, and Hasan and Koning study how proximity between teams affected their interactions during this week. To measure interactions, they survey people after a week (do you know this person? Did you ask them for advice?) and also see if they sent each other more messages via email or Facebook. As with Roche and coauthors, the impact of very minor distances seems to matter a lot. The probability that bootcamp attendees reported they knew, sought advice from, or frequently messaged people on other teams dropped rapidly as distance increased (focus on the black lines below, for now - we will discuss the dashed ones shortly).

[Figure: Probability of working with members of other teams (vertical axis, black solid line), as a function of walking distance (horizontal axis). From Hasan and Koning (2019)]

It's also worth noting that all the teams in this study were as close as the teams in the first quartile of the Roche, Oettl, and Catalini (2022) study, so even among the top 25% closest startups, it seems likely the very closest exchanged most of the information. And note, in both of these studies, the locations of teams were random - it's not as if people were grouped by the similarity of their work. And yet, proximity seemed to matter quite a bit for information sharing anyway.

Communication or Discovery?

So far, we've found evidence that jamming people together in a building increases the probability that they exchange information and start joint projects, especially if their workplaces are very close within the building. This could be for at least two different reasons though. First, being close might make it easier for people to communicate. We know this is true, in the sense that you literally don't have to walk so far to talk face-to-face with someone who is nearby. If face-to-face conversation is a much better way to trade information than digital messaging, then we expect close coworkers to trade more information. They might also decide to start more scientific projects together, because they know it'll be easier to complete those projects when it's so easy to communicate. Call this the communication advantage of proximity. Second, being close might make it easier to meet new people. You might not march across the room to introduce yourself to someone you...
Creative aliveness: turning life into a creative adventure
The advent of modern creativity means that everyone, not just those few inspired by the Muses, is invited to transform and shape the world. Each day, people connect ideas together, solve problems, and invent novel solutions. However, the explosion in individual innovation has also led to the proliferation of structured creativity — neatly compartmentalized pockets of creative exploration designed to achieve specific goals in a specific context. In those well-defined spaces, creativity is decoupled from failure and stripped from its messier components. At work, we participate in a brainstorming session following predefined steps with our coworkers. Then, we join a “paint and sip” class in the evening with a friend. In both cases, we are guaranteed some form of creative output, and the boundaries between creativity and productivity are slowly dissolving. Creative aliveness consists in reclaiming a larger creative canvas woven into the fabric of our lives. It starts by asking yourself: what makes you come alive creatively? And how can you inject more creativity into your daily life? The flow of creativity The definition of creativity has greatly evolved throughout the centuries and across the continents. In the Western world, creativity was for a very long time considered the realm of divinities. Whether they received their inspiration from the Muses in Ancient Greece or the “Creator” in Christianity, humans were considered incapable of creating — limited to imitating or following the rules set by a higher entity. It wasn’t until the Enlightenment that the concept of individual creativity started to emerge. Instead of emanating from divine forces, creativity started to be considered a human ability, which led to many innovations in science and technology. During that time, researchers Weihua Niu and Robert Sternberg explain that “creativity is also viewed as a property that belongs not only to a select few; everybody can exercise it. And its exercise can occur within the mundane experiences of life, not just in the formulation of significant scientific, artistic, or other achievements.” On the other hand, the concept of “natural creativity” was prevalent for many centuries in the Eastern world. For instance, the Book of Changes states that yin-yang movements — how yin and yang mutually change from one to the other — create everything. Individual creativity is nowadays also celebrated in Eastern culture, but the legacy of classical Eastern philosophy makes it slightly different from its Western counterpart. While Western creativity is based on the concept of novelty, Eastern creativity is based on the concept of change. Thanks to the unity of nature and human thought, humanity can participate in the process of the development of the universe, and creativity is seen as a state of flow rather than a sudden burst of innovation. In modern Western and Eastern cultures, creativity is now understood as deeply embedded in the fabric of daily life. From something that is exclusive to higher-level entities, whether deities or nature itself, it has evolved into an ability anyone can practice at various levels, whether it is to create art or to perform small acts of innovation as part of the everchanging movement of the world we inhabit. But, somehow, this creative aliveness is not what most of us experience in our daily lives. Living a creative life When we think about creativity, we often picture big “C” creativity: a musician composing a song, an artist drawing a portrait, an author writing a novel. 
However, as both Western and Eastern philosophies have come to agree, creativity is a fundamental element of human nature, an ability we use every time we come up with a new idea or find a solution to a novel problem. It goes hand in hand with being human. Put simply, creativity is imagination in action, and this action can take place virtually anywhere. The biggest creative canvas is our own life. The possibilities are practically infinite, the tools at our disposal are many, and each person we connect with opens the door to a potential creative collaboration. However, life can get so busy we end up squeezing our creativity into a tiny canvas — small, pre-defined spaces such as an art class or a brainstorming session. Instead of constraining our creativity to these bounded containers, creative aliveness elevates our day-to-day experiences by considering each moment as an opportunity for curiosity and innovation. Whether it's the way you approach a conversation with a person you just met, changing your itinerary to go to work, or composing a little poem in your head while in the shower — creativity can happen anywhere, at any time, if you direct your mind to it.

So, what exactly makes a creative mind? There are three key pillars at the foundation of living a creative life. A creative mind is…

Connected. Creative aliveness requires inspiration, which can take many forms. A creative mind is connected to people who stimulate the imagination, to networks that foster collaboration, and to sources of content that provide interesting information.

Curious. Asking questions is one of the best ways to live a creative life, whether you ask these questions to yourself or to other people. Curious minds tend to question the obvious, to consider alternative answers, and to consistently dig one level deeper to get to the core of how things work.

Courageous. Failure is the other side of the creativity coin. While creativity doesn't have to be grandiose, it requires stepping into the unknown, and sometimes taking bold risks with limited knowledge of the potential outcomes.

A creative mind is also a healthier mind. While research into the relationship between creativity and mental health has historically focused on case studies of psychopathology (the trope of the "mad genius"), recent meta-analytic studies suggest that engaging in creative activities can benefit physical and mental health. And the good news is: creative aliveness can be cultivated, without the need for complicated training or systems.

How to practice creative aliveness

Creativity can manifest itself in many different contexts and forms, and there are practically infinite ways to come alive creatively. The three principles below can help you get started in practicing creative aliveness and reclaiming more space for embedded, spontaneous creativity.

Make space for unstructured creativity. You don't need to create something useful or beautiful. You could doodle while watching your favorite TV show or imagine alternative endings to a novel you just read. In fact, you don't need to produce anything at all. Dancing in your living room when you're alone or dreaming about what the perfect city would look like — these are also acts of creativity. In addition to your more structured times for creativity, inject little acts of creativity into your daily life.

Consider each interaction as a creative playground.
Let creativity permeate each moment of your life, whether you interact with a person, a piece of content, or a problem you want to solve. How can you approach this interaction differently? How can you inject some randomness into the interaction? For example, you could skip the small talk and ask a new acquaintance a big question instead or create a drawing out of an article you’re reading. Write as an act of self-creation. Journaling allows you to not only document, but also to shape your mind and thus your life. It’s a powerful way to turn life into a creative adventure: by capturing what has happened and thinking about what you want to happen next, you are literally writing the story of your life. As Flora Bowley wrote in her book The Art of Aliveness: “Creative adventures can be as messy or involved as you choose, but they don’t need to be fancy, groundbreaking, or even take a lot of time to be valid and effective. In fact, simple acts of creative expression and innovation woven into daily life have an incredible way of soothing, stirring, and reminding us what it feels like to be alive. In turn, flexing our creative muscles fortifies our ability to be more tuned in, observant, and adaptable in all parts of our life.” Creative aliveness is a perpetual state of self-creation that allows us to shape our experiences and the world around us. By injecting creativity into the way we navigate the world, approach our tasks, and nurture our relationships, we can live a more fulfilled, exciting life. The post Creative aliveness: turning life into a creative adventure appeared first on Ness Labs.
How to switch from Roam to RemNote
Note-taking apps for networked thinking such as Roam Research have made it easy to create connections and generate new ideas. However, if you need to study and memorize the content of your notes, you may benefit from switching to a note-taking app for learning, such as RemNote. With the promise of being your all-in-one study companion, RemNote is an excellent app for students and teachers. If you are thinking of switching from Roam to RemNote, continue reading to explore why and how to do so.

Why you may want to switch from Roam to RemNote

While both apps share features such as bi-directional linking, a sidebar for viewing multiple notes, and block-based outlining, they have different use cases. Here are some key considerations to take into account if you are thinking of making the switch from Roam to RemNote.

Studying. RemNote is not just a note-taking app with backlinks. It also has features such as flashcards, spaced repetition, and PDF annotations, making it easy for students to learn and retain information. While these features are also available in Roam via plugins, they are built into RemNote, are less buggy, and require less set-up. With RemNote, you can take notes and turn them into flashcards. Research has shown that taking notes is not sufficient for studying — you need to test yourself at spaced-out intervals, and one of the best ways to do that is with flashcards. This makes RemNote a powerful tool for students to learn their course material.

PDF annotations. Roam does not allow you to annotate PDF files. With RemNote, you can attach a PDF to your database and annotate it directly. You can take notes on the PDF, highlight passages, and refer to them in your notes and flashcards. With this feature, you can attach lecture slides and ebooks to your database to quickly refer to the source when reviewing your flashcards and notes.

Data security. As with all cloud-based apps, there is always a risk in storing your notes in them. There is a possibility that the company might go down one day, taking your notes down with it. It is hard to fully commit to a note-taking app and trust it with your magnum opus if there is a threat of it all going to waste should the company fold. Although companies might say they are in it for the long run, there is no guarantee. Even if you can back up your notes, there is also the problem of not being able to access them due to proprietary formatting. While RemNote is a cloud-based app, the team guarantees that they will release the code as open source should the company wind down in the future, ensuring the longevity of your notes and flashcards.

Why you may not want to switch from Roam to RemNote

Now that we have explored the key differences between Roam and RemNote's features, you must be wondering: should you make the switch? If the following features are essential to your workflow, you may want to stick with Roam.

Query functionality. One of Roam's best features is its query function. You can easily find the notes you want and filter them accordingly by writing query syntax in a block. However, RemNote's limited query function is not as powerful as Roam's.

Backlink filters. In Roam, you can use filters when searching through your backlinks. This feature is helpful if you need to constantly filter the linked and unlinked references that you have made in Roam. RemNote does not have this feature, and it could be a dealbreaker if you constantly need to filter through your backlinks.
Project management Roam also has extensive project management features compared to RemNote. You can create to-do lists, tick them off, and schedule the tasks with the date picker function. Roam also has other features for project management, such as Pomodoros, Kanban boards, and tables. How to migrate from Roam to RemNote If you are convinced about making the switch, going from Roam to RemNote is simple and only takes two steps. 1. Export your Roam database.  In Roam, go to the three dots in the right corner, and select Export All from the drop-down menu. Select the format as Markdown, and click on Export All. 2. Import database into RemNote.  Next, click on your profile on the left sidebar, select the import function, or click on this link. From the dropdown menu, choose Roam. Then, click Add File and select the zip file of your Roam database. Once that is done, click the Import Roam graph to start the import. This import will take some time, depending on the size of your Roam database. Getting used to RemNote Congratulations, you have now successfully migrated to RemNote. Let us explore some features of RemNote to get used to it in no time. 1. Use the web clipper. When going through web pages, you can save, take notes, and make references directly by using the RemNote web clipper. 2. Everything is a Rem. In Roam, pages and blocks are two different things. In contrast, RemNote treats all information you input into it as a Rem. There is no distinction between a page and a block, allowing for better hierarchical outlining. 3. Universal reference inserter. Unlike in Roam, where you can reference pages using double brackets [[]] and reference blocks using (()), since everything is a Rem, you only need to use double brackets to reference. This reduces cognitive load and makes it frictionless to reference. You no longer need to think whether the information you want to reference needs to be invoked with double brackets or parentheses. 4. Create Rems. As mentioned earlier, RemNote has built-in flashcard features, which they call Rem. You can make several types of Rems, from the single answer Rem to cloze (fill in the blank) Rem. You can create a Rem by typing your question in the block, adding double colons (::), and writing your answer after the double colons. To review your flashcards, click on Flashcards on the left sidebar. The spaced repetition algorithm will then resurface your flashcards according to how much you can recall the information in the flashcard. 5. Daily notes. Like in Roam, the first note that opens when launching the app is the Daily Notes page. The Daily Notes make it frictionless to write and journal. As always, beware of shiny toy syndrome when thinking about switching apps. Consider the goal you are trying to achieve with your tool for thought. If you are looking for a tool for learning, switching to RemNote might be a good idea. If generating new ideas and cultivating a digital garden is essential to you, you might be better off sticking to Roam. The post How to switch from Roam to RemNote appeared first on Ness Labs.
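To close out with a concrete picture of the "resurface according to recall" idea mentioned above: spaced-repetition schedulers generally lengthen a card's review interval after a successful recall and shrink or reset it after a failure. The sketch below is a deliberately simplified, generic scheduler for illustration only; it is not RemNote's actual algorithm, and the function and parameter names are made up.

```python
from datetime import date, timedelta

def next_review(interval_days: int, ease: float, recalled: bool):
    """Toy spaced-repetition step: grow the interval on success, reset on failure.

    A generic illustration (loosely SM-2 flavoured), not RemNote's scheduler.
    """
    if recalled:
        interval_days = max(1, round(interval_days * ease))
        ease = min(ease + 0.05, 3.0)   # successful recalls slowly raise the ease factor
    else:
        interval_days = 1              # forgotten cards come back tomorrow
        ease = max(ease - 0.2, 1.3)    # and get reviewed more densely afterwards
    return interval_days, ease, date.today() + timedelta(days=interval_days)

# Example: a card recalled correctly three times in a row gets spaced further out
interval, ease = 1, 2.5
for _ in range(3):
    interval, ease, due = next_review(interval, ease, recalled=True)
    print(interval, round(ease, 2), due)
```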
Mimetic learning: the power of learning through imitation
We all know that children learn through imitation. They observe and then mimic their parents when learning how to speak, perform new motor skills, and interact with others. What you may not know is that mimetic learning is a lifelong process. In adulthood as well, the way we behave is heavily influenced by how others conduct themselves. Every single day, we are exposed to the actions of others — whether it’s friends, family, colleagues or those in the public eye — but may not realise that mimetic learning is taking place, which can lead to unintentionally copying unproductive behaviours. However, with a better understanding of mimetic learning, you can harness this powerful tool to foster better personal and professional growth. The science of mimetic learning Mimetic learning is a form of social imitation that is essential for learning how to behave and interact with others. From an evolutionary perspective, mimetic learning makes a lot of sense. It’s essential for our survival and sense of belonging, with one generation showing the next the behaviours required of them. In the 1960s, psychologist and father of cognitive theory Albert Bandura first described mimetic learning as a type of social learning in which we observe the actions of others, and then develop similar behaviours ourselves. This is especially true if our experience of observing someone else feels positive or rewarding, which means that unlike cognitive or behaviourist theories of learning, mimetic learning has a strong social element. As he explains: “Most human behaviour is learned observationally through modelling: from observing others one forms an idea of how new behaviours are performed, and on later occasions, this coded information serves as a guide for action.” Bandura described three main steps to mimetic learning: observation, imitation and modelling. With observation, we simply observe the way others behave. Then, the observed action is copied through imitation. Finally, when we see someone as a role model, we assimilate the imitated behaviour through modelling, which leads to consistently replicating that person’s actions. Dr Christoph Wulf noted that mimetic learning does not mean blindly copying someone else, but instead, observing someone else’s actions to enable “enhancement of one’s own world view, action and behaviour.” The result may not be an exact copy of the original behaviour, but will be integrated with our pre-existing set of patterns. However, because it’s such a powerful tool, mimetic learning can be taken too far. In their famous 2004 study, Dr Victoria Horner and Dr Andrew Whiten demonstrated that, when compared to chimpanzees performing the same task, human children were more likely to emulate the behaviours of adults exactly, even if some actions were irrelevant to completing the task. This process is called over-imitation. As adults, therefore, it’s crucial to understand how we can make the most of mimetic learning without falling prey to ineffectual over-imitation. Mimetic learning in the workplace Observation and imitation in the workplace occurs in several ways. In many situations, we may not consciously be aware that we are observing a colleague’s actions which we may later imitate. Stephen Billet observed that most of our learning in life takes place organically as part of the process of living and growing as individuals, rather than in the form of an organised educational or professional activity. The first type of mimetic learning is through learning from a live model. 
For instance, if you notice a colleague completing a project in a succinct, organised manner, then you may mimic their actions to complete your own tasks in a similar fashion. Upon seeing how others achieve their goals, mimetic learning allows you to emulate their conduct to work towards your own success. But this form of mimetic learning can also have unintended consequences. If you observe a colleague cutting corners and still achieving their goal, you may unconsciously model their actions, and start taking questionable decisions when managing your projects. Another way that mimetic learning occurs is by observing a verbal instruction model. Rather than directly witnessing the behaviour, your colleague might describe the behaviour so that you can emulate it. With a verbal instruction, the framework is provided, but you must learn to adapt your behaviour accordingly. The verbal instruction model can be highly effective, but leaves more room for miscommunication, which means that you may misinterpret some steps in the behaviour. Finally, mimetic learning doesn’t have to involve a real-life situation. The last form of mimetic learning involves symbolic models. Dr Rivi Frei-Landau and colleagues found that observing a simulated situation can be a valuable teaching tool. Although participants in the study knew that they were watching a pretend scenario unfold, mimetic learning was achieved as observers benefited from “adopting multiple perspectives, balanced emotional involvement [and] cognitive critical thinking”.  That’s why symbolic models of behaviour observed in pre-recorded training sessions or online seminars can also be beneficial for mimetic learning. But that’s also why we may mimic behaviours that we observe in online personalities or influencers, or even characters featured in novels or on television shows — often without realising it. The five pillars of mimetic learning When done properly, mimetic learning can be an impactful way to increase your skill set, productivity and performance at work. Make sure to rely on those five pillars so you can make the most of mimetic learning while avoiding its potential pitfalls: Attention. Notice when you are observing a desirable behaviour so you can give it your full attention. Whether it’s a live model, a verbal instruction model, or a symbolic model, try to avoid distractions and deeply engage with the model you are observing. Retention. You will quickly forget a desirable behaviour if you simply observe it once without any retention mechanism. To retain the new information, take notes on what you witness to refer back to when you begin replicating the behaviour in the future. Reproduction. Observing and appreciating a behaviour is not the same as being able to demonstrate it yourself. You must fully understand what you’re trying to imitate. Regular practice of your new desired behaviour will be required to ensure you can perform the behaviour correctly. Try to demonstrate the behaviour on your own, and go back to your colleague to ask questions if any step is unclear. Integration. To develop a new professional behaviour, you need to integrate it into your existing patterns by incorporating the desired behaviour into your daily work. For more important behaviours — and especially if you are in a leadership position — you may even want to consider coaching as a way to speed up the integration process. Motivation. For mimetic learning to happen, you must remain motivated to demonstrate the desired behaviour over the long term. 
A good way to maintain motivation is to track your progress through journaling. Ask yourself: when did you try to demonstrate the desired behaviour, what were the outcomes, what could you have done better? To reap the benefits of mimetic learning, you must remain aware that not all forms of imitation are beneficial and over-imitation could be detrimental to your progress. Blindly imitating someone could lead you to copy behaviours that offer no personal or professional benefit. By fully focusing your attention when you observe an interesting behaviour, you can decide which to invest your time learning to emulate, and which to reject. Mimetic learning transforms the simple observation of others’ actions into fuel for self-development. It allows us to incorporate observed behaviours in our personal and professional lives to drive our own personal growth. To enjoy the benefits of mimetic learning, make sure to consciously distinguish between valuable and inefficient behaviours, and remember that some actions can be performed in a manner that is personal to you, while still leading to the same overall outcome. The post Mimetic learning: the power of learning through imitation appeared first on Ness Labs.
Can you get a doctorate online and should you?
A Doctor of Philosophy (PhD) is the highest-level degree awarded in a specific academic field. Beyond the title, a PhD can be exciting for many reasons. It's a way — not the only way — to collaborate with passionate people on important problems, to contribute new knowledge to the world, and to open doors to careers at the highest levels. In many fields, a PhD is a requirement to be hired as a researcher, a scientist, or a university professor. However, a PhD is time-consuming. If you are not living close to a university and if you already have other obligations that prevent you from joining an in-person PhD program, you may be considering an online PhD. The short answer to whether you can get a doctorate online is: yes, it is possible. The longer answer is that it comes with many caveats, limitations, and risks that need to be navigated carefully. But, with the right approach and a good amount of preparation, some fields of research can be great candidates for an online PhD.

The pros and cons of an online PhD

Similar to all distance learning programs, the main advantage of an online PhD is the flexibility. You can work from anywhere and in your own time. This flexibility may be attractive if you already have other obligations, such as being a stay-at-home parent, being a caregiver for a family member, not being able to move cities to get closer to a university, or having a job that you cannot quit.

The main disadvantage of an online PhD — provided it is offered by a reputable university — is the isolation, which is especially problematic in academic research, where a lot of the ideas are generated through serendipitous conversations with colleagues. When getting a doctorate online, you will be working on your own most of the time, which can feel lonely and demotivating. However, this challenge can be mitigated by participating in academic conferences, whether online or in person, where you can meet fellow researchers, exchange ideas, and grow your network. With social media, it has also become easier to share your work with the world and to foster conversations outside of your immediate academic group, which can be particularly helpful if you are pursuing an online doctorate.

Another disadvantage is that you will be limited in terms of areas of research, as many academic fields rely on the use of specific equipment that can be costly, or even impossible, to acquire on your own. There is little you can do about this limitation, except to choose an area of research that doesn't require such equipment.

Finally, it is a common misconception that getting a PhD online costs less than an in-person one. Most universities charge the exact same price for online doctorates as they do for the ones delivered on campus, though some of them include the price of plane tickets and accommodation for when you absolutely need to be on-site, namely for the final oral examination where you present your thesis. In addition, it is very rare — practically unheard of — to obtain institutional funding for an online PhD, so you will need to cover the costs of your tuition and your research, which can be a considerable amount of money.

Avoiding online PhD scams

The pros and cons outlined above only concern legitimate online PhD programs. Unfortunately, bad actors are capitalizing on the fact that many people would like to pursue a doctorate without having the ability to do so in person.
Diploma mills are organizations that claim to be a higher education institution, and that deliver diplomas in exchange for a fee. Those diplomas are either fake or practically useless. Diploma mills exist at all levels of study, but they are particularly problematic at the doctorate level, where you will find yourself working for three to six years, which is a significant investment of time and money. It can be hard to tell these organizations apart from legitimate institutions, as they often have beautiful websites, a brochure, and a call center where you can ask questions. To make matters worse, diploma mills are often supported by accreditation mills, set up for the purpose of providing an appearance of legitimacy. There are several red flags you should pay attention to if you are considering an online PhD.

Accreditation. Is the organization accredited by a nationally recognized accrediting agency? Beware of buzzwords such as "licensed", "authenticated", or "notarized", which are not relevant to academic credentials. It is always worth looking up the accreditation agency to ensure it is legitimate itself.

Admissions. What are the criteria to join the PhD program? Do you have to submit transcripts, go through an interview process, submit an application that includes referral letters? If not, it is a good indication that the organization is a diploma mill that is only interested in your credit card.

Studies. Will you need to conduct actual research, work under a supervisor, submit a thesis, and defend it at the end of your PhD, or do you only need to maybe watch a few pre-recorded videos, or submit some work that will not be evaluated?

Duration. How long will you need to study for the PhD? Depending on the country, a PhD can take anywhere between three and six years, sometimes longer. If the organization promises a fast turnaround for obtaining your doctorate, this is a massive red flag.

Faculty. Who are the members of the academic team? All reputable universities have very transparent registries of their staff, so you can have a look at their publications, and even email them to discuss your PhD project. Diploma mills will have fake team members with stock photos, and of course no academic publications to their names.

A lot of universities, even the smaller ones, will have a Wikipedia page where you can read about their history, departments, and alumni. This is not in and of itself a guarantee that you are looking at a legitimate institution, but you should be extra careful when the only material you can find about an organization is what they have published themselves. Now that you are sure that you are looking at a legitimate institution, let's figure out what research areas would be good candidates for pursuing an online doctorate.

Research areas compatible with an online doctorate

While in the vast majority of cases it is more productive to be working on-site with your team, including your supervisors, there are three main factors that make an online PhD a reasonable option.

All the research can be conducted online. This is obviously the most important factor. Psychology, philosophy, history, mathematics, sociology, theoretical physics, and marketing are some examples of fields where you may be able to conduct all of your research online, without impacting the quality of your output. Focus groups, interviews, content analysis, questionnaires and ethnography are all methods that can be used online.
The research could also be done by accessing online libraries, conducting web-based experiments, or analyzing existing datasets.

The research benefits from being conducted off-campus. In some cases, you may actually need to be away from campus to conduct your research! Many scientists who work in anthropology, botany, or animal studies need to study their subjects in their environments. They spend their time in the field, where they collect data, and rarely spend time on campus.

The research does not require in-person resources. Will you need access to a laboratory or expensive equipment that you cannot acquire on your own? If you want to conduct brain-scanning experiments, an MRI machine costs between $300,000 and $1 million, and even if you somehow had access to that kind of money, it's unlikely you'll be able to fit one in your living room. In contrast, if you are doing computational neuroscience, you may be able to build computational models of the brain from your laptop, without needing access to a laboratory.

Again, a PhD is an intense endeavor, and most PhD candidates will benefit from having in-person interactions with their team, but an online PhD is feasible if some of these criteria are met.

How to apply to an online doctorate

The process should be fairly similar to applying to an in-person PhD program. First, you need to identify an area of research that is compatible with being conducted fully online and does not require in-person resources. Bonus points if the research actually benefits from being conducted off-campus. Once you have an area of research in mind, you need to find a relevant PhD program. In some cases, you will need to first find a supervisor who is willing to work with you. Go through the university's directory and email potential supervisors, explaining what your research interests are, sharing your previous experience, and why you think they would be a good match as a supervisor. Once you have a supervisor on board, they will walk you through the application process. In other cases, you will be assigned a supervisor once accepted onto the PhD program.

Then, it's just a matter of following the instructions on the university's website, which will often require you to upload your transcripts from previous studies as well as one or several letters of referral. You will often have to go through some interviews with faculty members, who will assess whether you would be a good fit for the program and for academic research in general. These are very similar to job interviews. Only once you are done with the application process and have been extended an offer will you be asked to pay your fees — and it is very common for the payment to only be required after a few months of studies, rather than before you start. Again, pay attention to red flags when applying for an online PhD, as there are many scams out there that only care about getting you to pay the fees.

List of online PhD degrees from reputable universities

Finding an online doctorate is not hard. A quick search will bring ...
Create a network of thoughts with Steffen Bleher and Michael von Hohnhorst co-founders of Capacities
Welcome to this edition of our Tools for Thought series, where we interview founders on a mission to help make the most of our thoughts. Steffen Bleher and Michael von Hohnhorst are the co-founders of Capacities, a graph-powered note-taking tool to save, connect, and organize ideas so you can be inspired and create lasting knowledge. In this interview, we talked about the concept of a second brain, the analogy between cities and minds, how thinking in hierarchical categories limits our creative thinking, how we all are chronological thinkers, how to use tags to form a meta network of your content, and more. Enjoy the read! Hi Steffen and Michael, thank you so much for agreeing to this interview. What inspired you to help people practice networked thinking? We live in an age of information abundance — and an equal amount of distractions. The most popular apps today are designed like slot machines: Netflix, Instagram, TikTok. We stream and scroll for the dopamine to kick in. We consume like animals in a lab, not realizing that we’re trapped in a social media maze. Yet, each one of us is trying to acquire a unique combination of specific knowledge — to create, to design or to invent. If we don’t want to drown in this flood of information, we have to take back control and create our own systems. A second brain is such a system. It’s the opposite of social media and news: it doesn’t control you, it empowers you. It’s a place without distraction — a studio for your mind. The term “second brain” was developed and popularized by Tiago Forte. It’s a framework to organize your digital life. Building a second brain means creating an external structure with all your thoughts, ideas, interesting content and media, learnings and experiences. The externalization in itself structures your thinking. Furthermore, it’s explorable like your mind: you can search and browse your thoughts. It resurfaces your ideas and helps you discover new connections. But we realized that creating such a system is really hard. When we looked for tools to do this, we couldn’t find anything that was built to tackle this challenge while still being simple to use and accessible to most of us. After some research we identified two major categories of apps that try to solve this problem. On the one hand, there is the traditional approach of folders and tables (or databases). They appeal to a broad audience because they are simple to use. We are used to organizing information in folders because that’s how computers worked since we invented them. While folders and sheets create a robust structure to collect and organize they contradict our natural way of thinking. Most ideas cannot be locked into a single box. We think in associations and relationships, one idea sparks another. That’s where the creative process takes place. After using these tools for a while, we found that they are somewhat blocking our minds: thinking in hierarchical categories limits our creative thinking. On the other hand, we found a branch of tools that takes a radically different approach: you create documents and blocks from the bottom up and link them on the fly to any other document or block that might be related. The tool then allows you to browse these connections in both directions. These tools map our thinking but are difficult to use. Their interfaces look like code editors — not like places where we want to be creative. They appeal to a techy audience — but their complicated syntax and a myriad of plugins make them difficult to use for most of us. 
We envisioned a tool that could solve this problem by combining the two approaches, a tool that becomes a natural extension of our mind and our personality. We both have a background in information theory and are connected by the excitement about information structuring and the power that can emerge from it. We combined this with our experience in artificial intelligence and user interface design. After nearly one year of endless discussions, collaborations with learning research and AI institutes, and of course many cups of coffee we came up with a data model that provides a robust organizational structure but also lets you think and associate content in any imaginable way. At that point, Capacities was born. Can you tell us how it works? Let us take you on a journey through the concepts that define Capacities and explain how they can help you structure your digital life. Imagine a city where every building was exactly the same. I think we agree this isn’t a place you wanted to live in. Buildings have functions: offices are built to work in, houses are designed to live in, and theater halls are created to bring people together. The same is true for our notes — they’re not all the same. Our minds are colorful. We think in terms of ideas, questions, people, or meetings. These notes have different purposes and properties. If all notes were the same we’d lose track and our note-taking would become monotonous very fast. In Capacities, we call these different notes entities and every entity has a type. An image, for example, is a first-class citizen in Capacities, it’s a piece of content with its own properties such as a title, tags or notes – in short, an entity of type image. The same applies to bookmarks, tweets, or files. Entities of the same type share the same structure and a design that differentiates them from other content and supports their function. Each type of entity gets its own database. For example, there’s a database where you will find all your images, tweets, or pages in one place. This gives you a great base level of organization for all your content. In addition to these basic types you can create your own types with custom properties to adapt the system to your needs. You can for example define the type “Person” which has properties like profession, company, contact information, and notes. Or you can create a type “Meeting” which has a date, notes on what was discussed, follow-up tasks or a list of attendees — there are basically no limits. These types make your note taking more colorful. You just create what you have in mind from everywhere within the app and the structure guides you in filling your entities with life. Now back to our city: different types of buildings make the city much more colorful. But now imagine all office buildings would be in one place, all residential buildings in another and every building could just be reached from a single street. That would be unpleasant as well, at least for us as Europeans. You’d have to travel a lot every time you need something different. That’s exactly the equivalent of using folders and tables to organize your thoughts. Every note lives at one place and you have to navigate to that location if you want to work on it. In Capacities, all notes are interconnected. They don’t have a fixed location. Mathematicians call this structure a graph. Every entity is a node in a network and can be connected to others. And the most important thing: in Capacities, this happens by simply taking your notes. 
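To make the typed-entity idea above concrete, here is a minimal sketch of how a graph of typed notes could be represented in code. It is purely an illustration of the concepts described (types, custom properties, and bidirectional links), not Capacities' actual data model, and all names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A node in the note graph: every piece of content has a type and properties."""
    id: str
    type: str                                  # e.g. "Person", "Meeting", "Image"
    properties: dict = field(default_factory=dict)
    links: set = field(default_factory=set)    # ids of connected entities (graph edges)

def link(a: Entity, b: Entity) -> None:
    """Connections are bidirectional: linking a meeting to a person also
    makes the meeting show up on the person's page."""
    a.links.add(b.id)
    b.links.add(a.id)

# A custom "Person" type and a "Meeting" type, as described above
ada = Entity("e1", "Person", {"name": "Ada", "profession": "Engineer", "company": "Acme"})
standup = Entity("e2", "Meeting", {"date": "2022-11-07", "notes": "Discussed roadmap"})
link(standup, ada)

# Opening Ada's page surfaces everything linked to her, including the meeting
print([e.type for e in (standup,) if e.id in ada.links])   # ['Meeting']
```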
When you create a meeting and add some attendees, they are all linked to that meeting, so once you open the page of a person you see all meetings you had with her. In the paper draft you’re working on you can directly embed ideas, or create questions, add PDF references or tags. When you browse through all your open questions you see where they were created and can quickly jump to that content. In your text you can quickly link to concepts and definitions you already created. When you come back to that definition in a few weeks you will see all text references at the bottom so you can quickly get an idea of the topic again. This network becomes more powerful over time as more and more content is added. You can explore it when you want to get inspired. It helps you find connections you haven’t thought about. It might even spark an idea for your next project. Capacities becomes a super power for creative thinking. Daily notes are a cornerstone of Capacities. Can you share a little bit more about your unique approach? Next to types and the network of ideas, daily notes are actually the third fundamental pillar of Capacities. If you allow us to come back to our city metaphor once again this will become clear. Beautiful cities grew over time. They change, adapt, and get reorganized. There are a lot of different time periods baked into the identity of a place. That’s the beauty of historic cities. Again, the same is true for your thinking. We all are chronological thinkers — time is a fundamental dimension of our learning process. And that’s where daily notes come into play. For each day you get one note for whatever feels important on that day. It can be the place where you start working and then branch out from there. Every other entity you’ve created on a day is also displayed below the daily note so you automatically get a nice overview of all your work. Next to individual daily notes we provide weekly, and monthly views that allow you to zoom out — it’s like a calendar for your thoughts. In your notes you can always reference a day, for example, by just typing “+today”. The current paragraph will then be shown as a backlink on today’s page of your calendar view. But this is just the beginning. We will integrate the time dimension more and more into your note-taking.  For example, we will add tasks as entities to Capacities that will be integrated in your daily notes dashboard, you will get daily statistics on your work and Capacities will resurface content based on time and context to help you create new connections, learn from your knowledge, and hopefully spark new ideas. With so many types of digital content you support… How do you make it a smooth experience for Capacities’ users? We believe that content types actually help you to get organized. Whenever you create a note, you most certainly know what type it will be: You want to add a q...
Habit trackers: does tracking your habits actually work?
We rarely lack good intentions. We want to drink more water, exercise regularly, or meditate every morning. Establishing habits, however, can feel like a struggle, and there’s often a gap between intention and execution. This is why habit trackers are such popular tools to help us stick to our goals. But do they work, and if they do, why do we tend to abandon them?  We are bundles of habits A habit is a behavioural pattern that occurs through repetition. Once established, the behaviour occurs naturally. This is great news for healthy habits, because once your routine is in place, you will automatically stick to it. But habits are not always good: an important downside is that ceasing undesirable actions can be challenging. In his book The Principles of Psychology, the pioneering philosopher and psychologist William James described living creatures as “bundles of habits”, explaining that developing habits “simplifies the movements required to achieve a given result.” He found that habits make our actions more accurate and less tiring to complete. Habit tracking is a way to log all of the times when you behave in a desired way — when you make the right choice, such as eating healthily, writing in your journal, or reading a book. There is evidence showing that tracking behaviour can increase the likelihood that habits will become established, as establishing healthy habits makes it easier for us to repeatedly make the right choices. And such tracking doesn’t have to be tedious. We now have a wealth of technology at our fingertips. This means it is easier than ever to track our habits. Which is fortunate, because the simpler and more goal-oriented the system is, the more successful habit tracking seems to be. Dr Katarzyna Stawarz and colleagues reviewed 115 habit formation apps, comparing app functionality to the cues that naturally help habits become established. They found that apps are most likely to successfully support habit formation if they provide contextual cues or implementation intentions to guide goal-directed behaviour. The benefits of habit tracking One of the key advantages of using a habit tracker is that it allows you to visualise your progress and identify any recurrent setbacks. This form of metacognition can help you adapt your approach and keep on improving your habit formation strategies. Let’s say that you want to drink more water. Once you get started with tracking, you can see exactly when you met your goal of staying hydrated, and when it was harder to stay on target. You may start to detect patterns, such as not drinking as much water at the weekends when you may be out of the house, or during busy work days when you may forget to rehydrate. In seeing these patterns, you can develop new strategies to embed the behaviour, such as setting reminders or ensuring you carry water with you at all times. In addition, using a habit tracker may improve your mental health and motivation. In 2020, Marco Stojanovic and colleagues asked students to log their study patterns. They found that when students used the habit tracker to improve their study habits, they were less likely to experience a bad mood or feel distracted while studying, and were also less likely to wish they were doing something more enjoyable. Across the six weeks, using a tracker increased the habit strength and motivation. One of the mechanisms through which habit tracking can benefit your mental health is by celebrating micro-wins throughout your personal growth journey. 
If your habit tracker shows that you have eaten healthily for a whole week or that you got eight hours of sleep for three nights in a row, you can experience some sense of accomplishment before the effects of such good habits start to show.

Finally, another advantage of using a habit tracker — especially a digital one — is that it can remind us to act. If you want to drink more water, having reminders throughout the day can help you get used to refreshing your glass or refilling a water bottle at regular intervals. Even if you are using a good old pen and paper to track your habits, seeing your habit tracking notebook on your desk may act as a trigger. Gradually, as the habit forms, the process will become automatic.

However, not all is rosy in the world of habit trackers, and you should not blindly assume that tracking your habits using any method or app will necessarily help you stick to your goals. Some of the aspects that make habit trackers so powerful can also be detrimental to habit formation.

The dark side of habit trackers

One problem researchers found with habit trackers is that they can create a “habit dependency” in users: you only stick to the habit because of artificial support such as reminders and streak notifications, which help with the repetition of a desired behaviour but tie the habit to in-app triggers. No app, and the habit is gone: the behaviour becomes dependent on ongoing app use.

Another problem is the overreliance on inflexible technology. Digital self-tracking is quickly overtaking paper-based tracking, but most apps focus on very simple habits that may limit users in their personal growth. Researchers have warned that many habit tracking apps are too rigid to support our diverse practical and emotional needs, and that more flexible, customisable self-tracking apps are required to meet the multidimensional goals and challenges of users.

Finally, it’s common for users to abandon an app before the habit becomes established, meaning that the app fails to fully assist with habit formation. As such, installing habit tracking apps often becomes a form of wishful thinking, rather than a productive strategy to build better habits.

The key to developing a habit is to find a way to ensure the desired behaviour becomes automatic. For habit tracking to be successful, it must therefore be simple and flexible, and encourage self-control and goal-directed behaviour.

The key to successful habit tracking

Like many self-improvement strategies, habit tracking needs to be designed carefully if you want to reap its benefits without falling prey to the illusion of productivity. Following three simple strategies will help to improve your chance of success when using a habit tracker.

Choose the right tool. You may instinctively know whether you’ll prefer a paper or app-based method of tracking. If you love to journal or enjoy putting pen to paper, you will benefit from the flexibility of paper-based tracking. However, if you feel more comfortable using your phone, a digital habit tracker will make it easier to set up initial reminders and track your progress at any time or location. Whichever method you choose, ensure that it doesn’t get in the way of the habit itself, as habits will only form with long-term repetition.

Decide if you are forming or breaking a habit. Habit tracking can be used not only to form habits, but to break them as well. Similarly to forming a new habit, breaking an existing habit will take sustained conscious effort.
If you want to make a habit, you should make the most of helpful triggers, such as putting a water bottle on your desk to act as a visual reminder to drink more regularly. If you are breaking a habit, you should instead remove any triggers, and replace the unwanted habit with an alternative activity — such as going for a walk instead of smoking.

Apply the “never miss twice” principle. It’s easy to feel discouraged when making or breaking habits, especially when life gets in the way. If you go to bed later than you wanted to, don’t be hard on yourself. Instead, say out loud: “Never miss twice.” Then, the next evening, make sure you have an early night. By never missing twice, you make space for the inevitable slip-ups while maintaining the motivation to succeed over the long term.

Establishing good habits and breaking bad habits can improve your life and your work, but it takes dedication for these behaviours to become automatic. Simple and flexible habit trackers can help you visualise your progress, boost your mood, and maintain your motivation. Just make sure that you have a clear idea of the habit you want to make or break, and focus on long-term consistency.
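If you enjoy building your own tools, the “never miss twice” rule is simple enough to automate. The sketch below (TypeScript, with a made-up habit and dates) shows how a plain log of completed days can flag two consecutive misses; it is an illustration of the principle, not a description of any particular app.

```ts
// Minimal sketch of a habit log with a "never miss twice" check.
// The habit name and dates below are placeholders, not real data.

type HabitLog = {
  habit: string;
  completedOn: Set<string>; // ISO dates (YYYY-MM-DD) on which the habit was done
};

// Returns true if the habit was missed on two consecutive days
// within the given date range: the signal to course-correct.
function missedTwice(log: HabitLog, from: Date, to: Date): boolean {
  let consecutiveMisses = 0;
  for (let d = new Date(from); d <= to; d.setDate(d.getDate() + 1)) {
    const iso = d.toISOString().slice(0, 10);
    consecutiveMisses = log.completedOn.has(iso) ? 0 : consecutiveMisses + 1;
    if (consecutiveMisses >= 2) return true;
  }
  return false;
}

const water: HabitLog = {
  habit: "drink two litres of water",
  completedOn: new Set(["2024-03-01", "2024-03-02", "2024-03-04"]),
};

// March 3rd and 5th were missed, but never two days in a row, so this logs false.
console.log(missedTwice(water, new Date("2024-03-01"), new Date("2024-03-05")));
```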
Protect your focus with Peter Hartree, creator of Inbox When Ready
Welcome to this edition of our Tools for Thought series, where we interview founders on a mission to help us be more focused and productive, without sacrificing our mental health. Peter Hartree is the founder of Inbox When Ready, a browser extension that helps you check your inbox with reasonable frequency, batch process email on a regular schedule, and minimize the total time you spend in your inbox. In this interview, we talked about the science of behavior change, the relationship between triggers and ability, how to set a productive inbox schedule, the importance of making the most of your “prime time”, and more. Enjoy the read!

Hi Peter, thank you so much for agreeing to this interview. Inbox overwhelm is a very common issue. What helped you see this opportunity to improve the Gmail interface?

One morning I was making a website and I needed to ask my client a question. I opened Gmail with the intention of emailing the client, but my attention was immediately derailed by some new, not-very-important messages in my inbox. When I remembered my original intention (perhaps 30 minutes later), I realized that Gmail would be much less of an attentional liability if only my inbox were hidden by default.

I hacked together a Chrome extension to implement this change the following day. I used it for a few weeks and found it was a big win for my focus. I went from seeing my inbox 10-20 times a day (mostly unintentionally) to just once or twice (when I deliberately chose to). As a result, I found it much easier to do several hours of uninterrupted deep work each morning, before checking my in-tray. It became clear that this simple change was saving me at least an hour per week, by supporting deep work and making it easy to batch process my email.

And what inspired the exact design of Inbox When Ready?

Some years before, I’d read about the Fogg Behavior Model, which says that the easiest way to change behavior is to optimize triggers and ability. Roughly, a “trigger” is something that gets your attention, and “ability” refers to how easy it is to do something. For example, if you put a glass of water on your desk, you’ll notice it frequently (the trigger), and it’ll be very easy to take a drink (the ability). So, with a glass on your desk, you’ll drink much more than if you have to get up to go get water. I had applied the Fogg model to several areas of my life, and been amazed by how powerful it was for promoting behaviors I wanted, and inhibiting those I did not.

With Gmail, I wanted to avoid unintentional inbox processing, and replace it with 1-2 deliberate inbox checks each day. So I needed to reduce triggers (accidentally seeing my inbox) and reduce ability (make it harder to see my inbox). From there it was clear that I should try making my inbox hidden by default, with a “Show Inbox” button to press when I wanted to see it. Later on, I added a “scheduled lockout” feature, which disables the “Show Inbox” button during time periods you specify, reducing “ability” still further.

Around that time I had generally been getting interested in attention design as an important subfield of user interface design, partly due to Luciano Floridi’s writing on infraethics. If you’re interested in that, I recommend James Williams’ recent book Stand Out Of Our Light.

What are some of the benefits of this approach to email management?

In short: more deep work, less distraction. It can also help you feel less stressed. One way to get a lower bound on the benefits: just think about it in terms of time saved.
Over the years, hundreds of people have told me they save at least an hour per week due to using Inbox When Ready. The extension is not useful for everyone. You need to get at least 5-10 emails per day, and also have a role where you don’t need to reply to email within minutes. It is used by a lot of CEOs, journalists and consultants, as well as teams who are collectively trying to improve their culture around email.

Some people are initially quite nervous about increasing their average response times, and are surprised when, in fact, their recipients turn out to be perfectly happy so long as they reply within (for instance) 24 hours.

We only have so much willpower, and I imagine people would still be tempted to check their inbox as a way to procrastinate. How do you address that challenge?

Some people develop the habit of automatically clicking “Show Inbox” after a while. To avoid this, about a third of users enable the “lockout schedule” function, which lets you disable the “Show Inbox” button during time periods you specify. It also blocks various message searches that suggest people are trying to circumvent the block.

More than half of users also set up a way to get notified immediately when someone sends them an urgent email. Many people do this using Gmail filters and the label-specific notification settings on Gmail for Android. In the last month or so, I’ve been testing an “inbox whitelist” feature that is native to Inbox When Ready. This makes it possible to see messages that match certain criteria even when the rest of your inbox is hidden.

How do you recommend someone set their inbox schedule?

Try to design everything around making the most of your “prime time” (sometimes called “power hours”). This is the time of day where you are usually most energized, most capable of highly productive periods of deep work. For some people that’s the first few hours of the morning, for others it is during the evening, or late at night.

My prime time is the first few hours each day — one to two hours before breakfast, then two to three hours afterwards. So I write my to-do list the day before, and make sure I can sit down at my desk and get into the most important stuff immediately. I then do three to five hours of deep work before checking any non-urgent inbound channels (for example email and Slack).

Speaking of Slack: a lot of people have asked me to make Slack When Ready. I seriously considered this, but it turns out that most people use the native Slack app, and it’s almost never possible to create extensions that customize native apps. I consider this a major and underrated scandal (I’m not alone). I dream that the EU might do something about this one day, but it seems unlikely.

Slack When Ready would be very useful indeed. Any other principles you apply when using Inbox When Ready?

I have a lockout period scheduled from 4am to 11am. That stops me seeing my inbox during my morning prime time. I usually check my inbox for 30 to 60 minutes at lunch, then update my plan for the afternoon accordingly. There’s also a “Hide category tab notifications” feature, which I always keep enabled.

That does look a lot less distracting. And finally… What’s next for Inbox When Ready?

As I said, I’m testing a new “Inbox Whitelist” feature. If it proves popular, it’ll be released to all users sometime soon. I’m also testing a version of Inbox When Ready for Microsoft Outlook. Generally I’ll keep maintaining the extension, and gradually improving it in response to user feedback.
I don’t expect to add lots of new features — I want to stay in the sweet spot of “highly effective” and “very easy to use”. The extension will only ever be used by a tiny fraction of the more than one billion people who use Gmail each month. My dream outcome would be to influence the designers of Gmail or, more realistically, a more specialist client like Superhuman. I have been in touch with several people about this, but no dice so far.

Finally, I recently started work on a new browser extension for Google Docs. The extension helps people search and review comments much more quickly than the native comment browsing features. This is very handy if you often work on 50-page documents that have hundreds of comments and suggestions. I’m calling it Comment Helper for Google Docs.

Thank you so much for your time, Peter! Where can people learn more about Inbox When Ready and give it a try?

It takes 30 seconds to install the extension. Within the first few days, most people sit down and think a bit about the ideal workflow they want to target, then configure the extension accordingly. I have some quick advice on how to do this. On the fence? Check the 1,500 or so 5-star reviews. I’d love to hear how you get on. You can reach me on Twitter at @peterhartree or via email.
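The core mechanic Peter describes (hide the inbox to remove the trigger, require an explicit click to reduce ability) is easy to picture in code. The TypeScript sketch below is a toy illustration of that idea, not Inbox When Ready’s actual implementation; the div[role="main"] selector is an assumption about Gmail’s markup, which changes frequently, and a real extension would also need a manifest and a build step.

```ts
// Toy content-script sketch: hide the main Gmail pane until the user
// explicitly asks to see it. Illustrates "reduce triggers, reduce ability",
// not the real extension.

const INBOX_SELECTOR = 'div[role="main"]'; // assumed selector for Gmail's main pane

function hideInboxByDefault(): void {
  const inbox = document.querySelector<HTMLElement>(INBOX_SELECTOR);
  if (!inbox) return;

  inbox.style.visibility = "hidden"; // remove the trigger: you no longer see new mail

  const button = document.createElement("button");
  button.textContent = "Show Inbox";
  button.addEventListener("click", () => {
    inbox.style.visibility = "visible"; // seeing the inbox now requires a deliberate act
    button.remove();
  });
  inbox.parentElement?.prepend(button);
}

hideInboxByDefault();
```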
How to switch from Evernote to Roam Research
Although Evernote has been the gold standard for digital note-taking for many years, new alternatives offer powerful features that are more akin to how the mind works. Gone are the days of plain old notes and cabinet folders. A new era of networked thinking has been ushered in to help us connect our notes and generate original ideas. If you are thinking of switching from Evernote to Roam, read this tutorial to explore key considerations and learn how to migrate.

Why you may want to switch from Evernote to Roam

With both apps offering a different approach to note-taking, it can be hard to decide if switching is worth the hassle. Here are some reasons you may want to switch from Evernote to Roam.

Creativity

Roam offers bidirectional linking, where you can link your notes together and reference them. However, Roam also allows you to link notes that you have previously not connected, called unlinked references. These features can help you generate new ideas by noticing recurring patterns and keywords. Compared to Evernote, this makes Roam a more suitable note-taking app for writers, researchers, and entrepreneurs.

Frictionless note-taking

Traditional cabinet-based note-taking apps such as Evernote require you to think about where you want to place a note before writing it, and this extra friction may make a habit of writing notes harder to build. When you open Roam Research, you are immediately greeted with the Daily Notes page, which acts as a scratchpad where you can get down to writing straight away. Because of bidirectional linking, you can resurface these notes organically when needed, removing the need to organize your notes in the first place. It is a frictionless way of taking notes that allows you to focus on the work instead of the logistics.

Journaling

Compared to Evernote, Roam makes it easier to start a journaling habit. Simply document what happened on the day on the Daily Notes page. Valuable things to track include your mood, people you met, your goals, and your habits. You can then track the things you wrote about using linked references and look at the patterns emerging from your daily entries. This is different from journaling on paper or other apps, where you simply store your journal entries without having any way to analyse the data and identify recurring themes. Thanks to its built-in tickbox and timestamp slash commands, you can also use Roam for interstitial journaling.

Research

Roam’s sidebar feature allows you to work on one main note and open another note in the sidebar by shift-clicking the note or the block. You can use this to open multiple notes and scroll between them, so you can quickly refer to your notes as you research. With the sidebar, you can jump between different notes without going back and forth between windows or tabs. This feature is unavailable in Evernote, where you must open multiple windows to achieve the same purpose.

Why you may not want to migrate from Evernote to Roam

As you’ve seen, Roam offers many benefits compared to Evernote. However, Evernote is still a powerful note-taking app. If the following features are non-negotiable, you might want to stick to Evernote.

Search functionality

When searching your notes, no app comes close to Evernote. Thanks to a great search view, and optical character recognition (OCR) that allows you to search scanned documents and PDF files, Evernote is an excellent app for capturing and searching everything from receipts to images. While Roam has queries, its search is limited to text.
Web clipper

You can save web pages in Evernote by using the mobile or desktop web clipper. It creates a copy of the web page, and you can save it to your selected notebook. With this, you can also use Evernote as a read-it-later app. On the other hand, Roam does not have an official web clipper, and you often have to turn to community extensions that are not as powerful as Evernote’s web clipper.

PDF annotation

If your work involves using many PDF files, you should stick to Evernote. While there are some workarounds for working with PDF files in Roam, Evernote allows you to embed and annotate your PDFs directly.

Integrations

As Evernote has a public API, there are many workflows that you can connect it with. At the time of writing, Roam does not have a public API and has very few integrations.

How to migrate from Evernote to Roam

Let’s start with the bad news. Roam does not offer a way to import directly from Evernote, and Evernote does not export to Markdown. However, not all is lost: importing all your notes from Evernote to Roam is still possible. For this purpose, you need Notion as a “bridging app” to help convert your notes into Markdown.

1. Import your Evernote database into Notion. First, create a free account with Notion and import your notebooks into Notion. To do that, click on the Import button in the left sidebar, authorize the connection between Notion and Evernote, and select the notebooks you would like to migrate. Notion will then import your selected notebooks into a database.

2. Export from Notion as Markdown and CSV. Next, click on the three dots in the top right corner and select Export. Select the Markdown & CSV option, and switch on “Include Subpages”. Notion will then download your database into a zip file containing your notes in Markdown and all your images.

3. Import into Roam via Markdown. Head over to Roam, click on the three dots in the top right corner, and click “Import Files”. Select the Markdown notes from Notion. Roam will then show an option to rename your files before you import them.

4. Transfer images. Create a page in Roam called Evernote pictures, and upload all your images to this page. You can then copy and paste or reference the pictures whenever you find a broken image link in a note; the sidebar function in Roam makes this easier to do.

Instead of importing your notes in bulk, an alternative method is to be selective about the notes you migrate to Roam. Take the time to consider the notes you use the most and the notes that are most important to you. Migrate these notes only. As you go, only migrate other notes when you need them. By doing this, you can immediately benefit from Roam’s features without going through the lengthy process of migrating everything in one go.

Getting used to Roam

As you have seen, Roam is a powerful note-taking tool. However, it has a steep learning curve and getting used to its features might take some time. Here are some things to take into account when switching from Evernote.

Use backlinks. To create a link to another note, simply type double brackets [[]] or a hashtag and type in the name of the page you would like to link. You can also do this at the block level with double parentheses (()). After making these links, you can find the notes linked to a page in the linked references section under each page. Here, you can further use filters to organize your references.

Use Unlinked References.
As mentioned above, you can easily connect your notes by typing double brackets and looking at the connections in the Linked References section. However, Roam also allows you to look at Unlinked References, where you can search for references to the current page that you did not link, and link them with one click. This allows you to find connections you did not think of as you explore your database.

Keyboard shortcuts. Keyboard shortcuts are one of the best ways to speed up your Roam workflow. Research suggests that using a keyboard shortcut to do a task can take half the time of doing the same task with the graphical interface. Some calculations also suggest that not using keyboard shortcuts costs you eight days per year. Read our cheat sheet and guide on Roam hotkeys to learn more about keyboard shortcuts.

Use Markdown. Roam uses Markdown to edit your notes, while Evernote uses a WYSIWYG editor. Familiarise yourself with the Markdown syntax in Roam so you can type and format your notes faster.

Choosing between Evernote and Roam depends on what you need your tools for thought to help you with. If your note-taking style is that of a librarian, you should stick to Evernote. If you think exploring and having a gardener’s approach to your notes can help you, consider switching to Roam. To learn more about using Roam, join our Roam support group in our community.

P.S. Want to learn how to make the most of Roam? Join Roam Essentials, a short course to master 20% of the features that will unlock 80% of Roam’s power.
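One optional clean-up step can smooth the migration described above. If your Notion export contains standard Markdown links between pages (Notion typically writes these as links to other .md files), Roam will not treat them as page references after import. The sketch below (TypeScript for Node, assuming the export has been unzipped into a ./notion-export folder) shows one way to rewrite those links as [[Page Name]] references between step 2 and step 3; treat it as a starting point rather than a polished tool, and back up your export first.

```ts
// Sketch: rewrite Markdown links to other exported .md pages as Roam-style
// [[Page Name]] references before importing the files into Roam.
// Assumes a Notion export unzipped into ./notion-export. Back up first.
import { readdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

const EXPORT_DIR = "./notion-export"; // assumed location of the unzipped export
const MD_LINK = /\[([^\]]+)\]\(([^)]+)\.md\)/g; // e.g. [My Page](My%20Page%20abc123.md)

// Recursively collect every file path under a directory.
function walk(dir: string): string[] {
  return readdirSync(dir, { withFileTypes: true }).flatMap((entry) =>
    entry.isDirectory() ? walk(join(dir, entry.name)) : [join(dir, entry.name)]
  );
}

for (const file of walk(EXPORT_DIR).filter((f) => f.endsWith(".md"))) {
  const original = readFileSync(file, "utf8");
  // Keep the human-readable link text as the Roam page name.
  const rewritten = original.replace(MD_LINK, (_match, title: string) => `[[${title}]]`);
  if (rewritten !== original) writeFileSync(file, rewritten);
}
```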
The arrival fallacy: why we should decouple our happiness from our goals
“When I achieve this goal, then I will be happy.” If you’ve ever experienced such a when/then thought pattern, you’re not alone. Whether you’re aiming to run a marathon, get a promotion at work or buy your first house, having a goal in mind can increase your motivation. However, we often mistakenly believe that achieving our goals will make us happy. That tendency is called the arrival fallacy.

It usually goes like this: upon meeting a goal, you will initially feel delighted. But, very quickly, you find yourself back at your usual level of happiness, or even facing a sense of emptiness. The disappointment of not experiencing the expected happiness, or only experiencing it briefly, can subsequently impact your well-being. Instead of falling prey to the arrival fallacy, it’s crucial to reframe your goals so you can avoid an anti-climax.

The short-lived nature of goal-based happiness

The term “arrival fallacy” was coined by Harvard-trained psychologist Dr Tal Ben-Shahar in his book Happier: Can You Learn to Be Happy? As a young elite squash player, Ben-Shahar had a recurring belief that if he could win a match or a tournament, he would experience happiness afterwards. However, though he would indeed feel happy upon winning, this feeling was short-lived. Once the euphoria faded, he found himself faced with stress, pressure, and a feeling of emptiness.

Rather than creating the long-lasting happiness or contentment he expected, accomplishing one goal simply led to a new sporting target being uncovered. Once a target had been hit, new goals appeared on the horizon. His list of goals was never fully completed.

Dr Maya Pilin further explored our ability to predict our future emotions. Pilin reported that our affective forecasting, or our ability to imagine how something will make us feel, is often inaccurate. This systematic inaccuracy is concerning, because being able to predict how we will feel is essential to our decision-making process.

So why are we so bad at predicting our happiness levels? Psychologists Timothy Wilson and Daniel Gilbert found that predictions about how a future event might make us feel are often flawed because of the impact bias. The impact bias leads to an overestimation of the duration and intensity of the positive emotions you may feel as a result of an event. We overestimate the positive impact the accomplishment of a goal will have, and we underestimate how other events or feelings may influence the way we feel.

Let’s say that after many years of effort, you finally get your dream role at work. Sure, you’ll be happy for a short while. But, despite reaching your goal, more senior career ambitions will appear, and you’ll assume that achieving these will lead to more happiness. In addition, happiness cannot hinge on one facet of your life. Other aspects including your health, relationships and finances will affect your mood as well.

The impact of the arrival fallacy

We have established that we fall prey to the arrival fallacy when we believe that achieving our goals will make us happy. And that’s not without consequences: the arrival fallacy affects us in many ways by impacting future decision-making and emotional well-being. For instance, if the initial euphoria of completing a marathon disappears after two days and leaves you feeling empty, you may conclude that striving to excel is not worth the hardship.
Alternatively, you may immediately start exploring new challenges to try to reach that short-lived peak of happiness again, without fully considering the potential consequences.

Dr Adam Dorr explains that the disappointment associated with the arrival fallacy stems from seeing “possible futures as static objects (a destination or goal) instead of as snapshots of an inherently dynamic process.” Happiness is not a static destination that can be reached after achieving a goal. While achieving a goal may give you a short-term boost, your levels of happiness will continue to rise and fall in accordance with the many internal and external events you experience.

Even highly educated people fall prey to the arrival fallacy. A study found that assistant professors commonly predicted that receiving tenure would strongly influence their long-term happiness. However, when this prediction was later checked, there was no significant difference in happiness levels between those who had been awarded tenure and those who had not.

Similarly, many individuals believe that life would be better or more enjoyable following a big lottery win. However, Dr Philip Brickman and colleagues found that major lottery winners were not any happier than control subjects who lived close by. Worse, lottery winners took “significantly less pleasure from a series of mundane events”, suggesting that the winners’ emotional well-being may have been negatively impacted following the arrival of their wealth — a common occurrence when we tie our happiness to the achievement of a goal.

How to manage the arrival fallacy

While achieving a goal may lead to an initial burst of endorphins, the following slump may either cause disappointment that the effort you put in has not paid off in the way you anticipated, or a frenzy to move onto a newer, bigger, more exciting goal. You may think that goal-setting is flawed and that you may as well live life without setting any goals, but that would not be the best strategy. Although completing a goal may lead to the arrival fallacy, Dr Tal Ben-Shahar maintains that having objectives is essential to personal growth. The trick is to repackage your motivation and change your perspective, making the process of achieving your goals as important as the result, thus helping to avoid an anti-climax upon crossing the finish line. Here are three strategies to help you avoid the arrival fallacy.

Avoid when/then happiness projections. If you find yourself saying “I will be happy when I [move abroad, have a baby, receive tenure]”, you are putting unrealistic pressure on the goal to contribute to your long-term mental well-being. Assigning intense expectations to the completion of a goal may leave you feeling disappointed. Rather than using when/then projections, practice mindfulness and take note of what currently makes you happy. Rather than hoping for happiness upon reaching your goal, proactively look at the positives in your life right now. This exercise can be done through journaling or meditation.

Focus on the journey, not the outcome. On your way to achieving a goal, enjoy the process so that you are not only hoping for joy at the finishing point, but instead experiencing it at each step of the process. Give yourself the space to experience the joy of learning, connecting with experts in your field, building a new feature, giving a great presentation, or finding a solution to a complex problem.
Learn how to learn, think about thinking, and develop skills such as creativity and decision-making. This way, whatever the outcome, the journey will have been worth the work.

Celebrate the micro-wins. Minor milestones can act as catalysts for bigger tasks. A micro-win might include getting a call scheduled with a new client or running one mile for the first time without stopping. Focusing on smaller wins will make you feel more productive and happier, even though you have not yet reached your long-term goal. Celebrating the micro-wins puts less pressure on the achievement of the main goal, allowing you to experience sustainable happiness instead of short-lived bursts of joy.

Setting goals does help propel you forwards, but relying on them for your happiness can make you fall prey to the arrival fallacy, which will negatively impact your well-being and decision-making processes. Rather than relying on unrealistic when/then projections, celebrate the aspects of your life that already bring you happiness, and enjoy the ongoing process of learning and personal growth.
Availability bias: the tendency to use information that easily comes to mind
As humans, our ability to make the right decisions is limited by the many constraints of our mind. One such constraint is the availability bias — our tendency to make judgments based on previous experiences that are easily recalled. When some piece of information is easily brought to mind, we incorrectly assume that it’s an accurate reflection of reality. This cognitive bias often leads to the illusion of rational thinking and, ultimately, to bad decisions.

The science of the availability bias

In 1955, Dr Herbert Simon formulated the notion that memory limitations can affect decision making. Simon further elaborated that it’s not possible for humans to consider every piece of relevant information. Instead, we focus on the data within our minds that’s easily available and thus seems to be the most pertinent. Simon’s research opened the door to the modern examination of decision-making processes, and the shortcuts we utilise to reach conclusions.

The term “availability heuristic”, another name for the availability bias, was later coined by Dr Amos Tversky and Dr Daniel Kahneman in 1973. They described the natural human tendency to assume that examples we can readily think of are more relevant than they truly are.

For example, if you’re looking for a new note-taking app, you might go for a particular tool because you recall that a friend recently raved about it. Or, if you read about a plane crash in the news a week before you are due to fly for work, you may overestimate the likelihood of your own plane crashing. When evaluating colleagues, managers may remember the one incident in which a team member accidentally caused a major delay to a project, without recalling the many other days in which the colleague worked without issue. The availability bias may lead to an unfairly negative view of the colleague.

Tversky and Kahneman wrote a series of papers examining biases used in judgement under uncertainty, and their research offered insight into the cognitive processes that explain human error. In their own words: “Availability is an ecologically valid clue for the judgement of frequency because, in general, frequent events are easier to recall or imagine than infrequent ones. However, availability is also affected by various factors which are unrelated to actual frequency. If the availability heuristic is applied, then such factors will affect the perceived frequency of classes and the subjective probability of events. Consequently, the use of the availability heuristic leads to systematic biases.”

The impact of availability bias on decision making

The availability bias may significantly impact your day-to-day decisions in both your professional and personal life. A study by Ping Li and colleagues highlighted that even doctors can misdiagnose patients as a result of the availability bias. Their study demonstrated that recent experience of a health condition makes it easier to recall, and therefore increases the chance that a subsequent patient might be misdiagnosed with the same illness.

Dr Valerie Folkes explains that the availability bias can also influence consumers’ beliefs about perceived risk. As part of her research, Folkes confirmed that our recent experience of a product becomes part of our internal decision-making data. For example, across one week you may count all the times your smartwatch fails to automatically sync to your phone to deduce an estimate of its failure rate.
However, this estimate would be based only on those recent failures to sync — which are easily recalled — without taking into account the many times it has synced without issue during the previous year. The easy access to recent data within your mind makes it seem like this information is important and accurate. As a result, the perceived smartwatch failure rate is higher than reality.

But that’s not all. Dr Norbert Schwarz and colleagues found that the availability bias can also impact self-evaluation. When asked whether they were assertive, study participants who only needed to list six examples of their assertive behaviour believed that they were assertive. But those who had to think of 12 examples found the exercise much harder, and concluded that they were not assertive. Their self-evaluation was based on how easy the recall felt — a typical example of the availability bias at play.

The availability bias can also impact the way that you feel about education. Dr Craig Fox wrote in 2006 that difficulty of recall can influence the way you evaluate a course. If you’re asked only to provide two pieces of negative feedback, you may conclude that the teaching you’re receiving is poor because it’s easy to think of just two negatives. However, if asked to list ten pieces of negative feedback, the task is far harder, and you’re likely to conclude that the course must be good.

As you can see, the availability bias impacts our decisions in many areas of life and work, which can lead to bad decision making. Fortunately, there are ways to avoid its worst pitfalls.

How to manage the availability bias

To make it easier and quicker to make decisions, our mind applies shortcuts. Some of those shortcuts lead to cognitive biases, such as the availability bias. While it’s not possible to completely overcome this cognitive bias, there are several ways to manage it.

Practise deliberate brainstorming. Instead of going for the most obvious solution, which is likely to be heavily influenced by your most recent experiences, conduct research and generate as many potential solutions as possible based on factual data. You can practise deliberate brainstorming on your own or with your team. This not only helps to manage availability bias, but may also lead to the generation of innovative solutions that you may not otherwise have considered.

Try “red teaming” ideas. Red teaming is similar to playing devil’s advocate. It involves rigorously challenging ideas and assessing ideas from an opposing point of view to discover flaws or shortcomings. The aim is to avoid making unsound decisions and to mitigate potential adverse events. As part of red teaming ideas, decision makers should explore alternative solutions, interrogate the underlying facts, and try to view the decision from an impartial point of view to thoroughly test the integrity of a decision.

Use self-reflection methods. Self-reflection can take the form of journaling, talking out loud to yourself, or thinking deeply about a decision while away from distractions — for example while taking a walk. Taking time to reflect on decisions prior to executing them allows for a delay that will reduce the power of the availability bias, so that other ideas have the necessary space to surface.

While the mind’s shortcuts make decision-making quicker and easier, they can lead to less-than-ideal solutions. As you have seen, the availability bias may lead to poor personal and professional decisions.
While it’s impossible for this cognitive bias to be completely overridden, you can avoid some of its most negative effects by brainstorming, red teaming ideas, and implementing a self-reflection practice. These strategies will help you take a more objective view of the information available to you.
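To make the smartwatch example above concrete, here is a tiny numerical sketch (TypeScript, with invented figures) of how the easily recalled sample can distort an estimate compared with the full history.

```ts
// Invented numbers: a bad week that is easy to recall vs. a full year of syncs.
const recentFailures = 3;
const recentAttempts = 70;     // roughly one week of automatic syncs
const yearlyFailures = 12;
const yearlyAttempts = 3650;   // roughly one year of automatic syncs

const recalledRate = recentFailures / recentAttempts; // about 4.3% (what comes to mind)
const actualRate = yearlyFailures / yearlyAttempts;   // about 0.3% (the base rate)

console.log(`Recalled failure rate: ${(recalledRate * 100).toFixed(1)}%`);
console.log(`Actual failure rate:   ${(actualRate * 100).toFixed(1)}%`);
```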
Capture the big picture with Maks Kuchur, founder of xTiles
Welcome to this edition of our Tools for Thought series, where we interview founders on a mission to help us be more productive and more creative, without sacrificing our mental health. Maks Kuchur is the founder of xTiles, a visual tool to organize information with card-based pages. At the intersection of a whiteboard and a note-taking app, xTiles’ goal is to help users see the big picture and structure content to free up time and energy for creativity. In this interview, we talked about the importance of flexibility and simplicity for creative thinking, how to use building blocks to assemble a thought puzzle, how to go from messy thoughts to structured ideas, how visual thinking can help find better solutions, and much more. Enjoy the read!

Hi Maks, thank you so much for agreeing to this interview. xTiles offers a unique approach to note-taking at the intersection of structured writing and free-form visual thinking. What inspired you to build such a note-taking app?

In the digital era, we have switched from paper to computers and mobile phones as storage for our thoughts and ideas. Note-taking apps are some of the most popular tools. Google Keep and Apple Notes are examples of simple apps, which work beautifully across their ecosystems and on all devices. Fifteen years ago I was amazed by Evernote. I was waiting for every new version and liked every feature that was released.

But times have changed, and people need more from note-taking. Nowadays, we take notes not just to save an idea or to not forget a thought. It’s a way to start working on a new project or a new task. So we not only need tools to store ideas and thoughts; we need tools to develop our thoughts or move our ideas from inception to execution. Such tools should offer a creative, flexible space to get things done, things that start from just a single thought in the form of a simple note and bloom into a successful project.

At some point, I realized that I always had to switch between different apps to develop my ideas. I started with a note, moved to a whiteboard or a mind map, then to some task-management system, and finally to a spreadsheet or a document. I didn’t use all of the features of all these apps, but I also couldn’t get everything done with just one app. I tried to find a solution, but most of the options were quite complicated.

I wanted a more visual tool, with more flexibility. In the physical world, we can move documents on a table, put stickers on a board, and mix and match pieces of paper. People should be able to do the same on a computer screen, but it is challenging to implement. Whiteboards work by constantly zooming in and out, documents by scrolling vertically, and spreadsheets by jumping around cells. These tools are not designed to help us develop new ideas. They restrict us with their UX, they don’t allow us to draw a picture of what we have in our mind, and they add unnecessary cognitive load.

With xTiles, my goal is to create a space where people can process their ideas from early formulation to successful execution, without limiting their creativity or wasting their time. It’s like a quiet assistant working in the background while you express your creativity.

A common problem with note-taking apps is that people dump all of their thoughts in there and never manage to turn them into actionable knowledge. How does xTiles address this challenge?

You are completely right. I had the same problem — I can’t call myself a very well-organized person.
A big problem is that most note-taking tools aim to capture and organize thoughts, but not to develop them. Creative specialists need tools that can adapt to their work; they need to have the freedom to restructure information during the thinking process. That’s the intention with xTiles. The three main strengths of xTiles are that it’s visual, flexible, and simple.

Instead of figuring out a solution in their head, many people find it more effective to unload their thoughts into a visual shape, which helps them clarify the problem at hand. xTiles combines the best of whiteboards and note-taking apps, with a vertical canvas as the basis of a page, and a set of building blocks to assemble your thought puzzle: tiles and tabs where you can place blocks, such as links, files, images, videos, and many other types of widgets. It’s easy to learn, while still allowing you to create a granular picture of your thoughts.

Another strength of xTiles is its flexibility. Our mind works by experimenting with different solutions, which in xTiles is reflected as drag-and-drop operations on the page. xTiles doesn’t restrict your creativity. You can move blocks between tiles and tabs, tiles between tabs and pages, and pages between workspaces. This allows your ideas to develop over time.

Finally, xTiles has a clear and simple interface. All the magic happens in the background, and users don’t need to spend a lot of time learning how to use it in order to get started. It’s all about the speed and smoothness of getting results. In xTiles, people can quickly make a Kanban board without thinking about databases: they can just use tiles to compose a view of several columns, fill it with data without extra formatting, and move on with capturing and processing their thoughts before they forget them. That is simplicity. In xTiles, users can execute more easily on their ideas without any extra cognitive load.

Visual, flexible, and simple — that sounds amazing, but also very ambitious! Can you walk us through your process of going from messy thoughts to structured ideas?

There are four steps to going from messy thoughts to structured ideas: capturing, organizing, distilling, and reusing.

The first step is capturing: I jot my thoughts down and collect additional data from the Internet. Starting from a new page, I create one or several tiles, and begin to write things down, copy-pasting data from relevant websites, using the web clipper or the mobile app. This first step overlaps with the next step — organize — because I can group blocks within tiles and tabs.

The second step is organizing. I categorize and group blocks within tiles, and tiles within tabs. Mixing and matching data gives me the ability to structure information and sync the picture with what’s going on in my mind. For example, one tile could be on the left, another on the right; one could be wide, another narrow; and I can use colors and decorative features to emphasize major aspects.

The third step is distilling. I personally like to create a one-pager view. I’m deeply convinced that one screen should contain enough information to help me find the solution to any problem. So I always try to get rid of any superfluous details from a page. However, details are also important. xTiles provides you with sub-pages, nested tiles and tabs, and collapsed tiles to make your page clear and to keep details in the background. You constantly have a “big picture” of your project, but you can dive deep at any moment.

Finally, the last step is reusing.
This step lets you start a new project or task, but not from scratch. If you invest some time and data into xTiles, then you can use your knowledge and ideas as the foundation to create something new. Creating a new page from existing data is easy and quick. It’s not like the usual copy-paste; it’s more like building a model with LEGO.

So, that’s how you go from lots of scattered pieces of information to a “big picture” — but how do you make that structured information actionable?

xTiles provides many features to make information actionable, such as checklists, reminders, due dates, mentions, and other features for task and project management. But I think the real power of xTiles resides in the environment that it creates for the user. Being productive means you should always consider the context you are navigating. Efficiently accessing relevant information is critical for producing effective solutions and getting things done.

Actionable information is information organized in a way that allows you to make the right decisions. Having quick access to information will help you build a comprehensive picture of the situation. The more flexibility you have in playing with that “big picture”, the more alternative solutions you will be able to explore. Visual flexibility matters too. If your information is more visual, you can share your ideas and collaborate with others more efficiently.

xTiles allows you to see the “big picture” while quickly navigating your workspace. Our visual search in a tile view is a powerful tool to find an entry point to your information space and to start browsing. Nesting features can hide details when you don’t need them and show them when they are helpful.

This is very comprehensive. What kind of people use xTiles?

We are a horizontal tool that can be useful for many kinds of people. The two main traits of our users are that they need to organize their knowledge, and that they are visual thinkers. Some of our users include writers, content marketers, consultants, advisors, tutors, teachers, entrepreneurs, and other types of experts. Many of them use xTiles as their primary note-taking and personal knowledge management tool. Everyone has their own combination of use cases. It could be writing, brainstorming, researching, creating storyboards and mood boards, organizing lessons, etc.

Our ideal user is a person who wants to be organized but doesn’t have much time to invest, loves to compose ideas from bricks like a puzzle, and needs to keep their ideas and projects in one place.

What about you… How do you personally use xTiles?

xTiles is my personal place of truth. This is where my ideas are born, where my projects develop, where I brainstorm and conduct research, where I reuse and combine my thoughts. As a team, we also use xTiles as a place to gather all relevant information for our projects.

And finally… What’s next for xTiles?

We are cur...
Thinking faster to create more with Andrew Nalband, founder of Thunk
Welcome to this edition of our Tools for Thought series, where we interview founders on a mission to help us become more productive and more creative without sacrificing our mental health. Andrew Nalband is the founder of Thunk, a daily thinking tool to get new ideas, save time, and get more done. In this interview, we talked about the shift from “personal digital assistants” to “personal knowledge management”, the associative nature of our brains, the power of beauty and delight, how to stay focused and avoid distractions, how to manage our inner critic, how to design a morning routine, and much more. Enjoy the read!

Hi Andrew, thank you so much for agreeing to this interview. First, Thunk is a young but gorgeous product — can you tell us more about the team behind it?

Thanks for saying that! We care a lot about making a beautiful tool. I get excited when I hear that people recognize how carefully we’ve considered our design choices. We are a tiny team of two and a half people and a few key contractors. On the development team, Sergey does the majority of the work and Ian works with us part time. They do the building. We’ve hired a few wonderful visual artists to create our illustrations and icons. I fill in the rest of the work: the UI design, product management, web design, copy, everything on our social media — those things are coming from me. Right now, that’s the whole team.

Small, but mighty! What exactly inspired you to bring Thunk into the world?

The best way to describe our inspiration is to start in 2007. I was working my first job out of college — data entry at a pharmaceutical company. Every day, I would get up and enter tax forms into crappy financial software. Candidly, I was miserable and looking for a way to distract myself. I was fascinated by the hot productivity tools at the time: PDAs or “personal digital assistants”. If you’ve been around the space for a while you’ll remember these devices. They had ugly, complicated interfaces and came with huge printed manuals. You couldn’t perform basic tasks on them without reading the manual.

One day, I got a tool that had a huge impact on me: the iPhone. Unlike PDAs, it didn’t need a manual. It was beautiful and easy to use. It really inspired me, so I made one of those silly novelty apps that were all the rage when the iPhone first hit the market, a fake shaving app called MyRazor. It became a top ten app across Europe after I released it, and got 2 million downloads. That experience got me really excited about making fun, beautiful software. Building something that people all over the world used and enjoyed changed my life. I’ve never been the same since.

Fast forward to now, and I’m still a productivity nerd. “Personal digital assistants” have been replaced with “personal knowledge management” (PKM). The terms are different, but I see all the same problems. The interfaces are ugly and complicated. Manuals have been replaced with online courses. People want to use modern tools, but they’re confused. I want to use modern tools too, but most aren’t carefully designed. Beauty and ease of use aren’t prioritized. How can you be productive when your thinking tool is complex, ugly, and cumbersome to learn? I wanted to make a modern thinking tool where beauty and ease of use were highly valued. That’s why I founded Thunk Notes.

And how does Thunk work?

Thunk is in a class of tools that I call connected thinking tools. There are three main features that are important: daily notes, templates, and backlinks. Let’s start with daily notes.
Every time you open up Thunk, you get a fresh, blank daily note. You don’t have to worry about where to put things before you start. You can just open it and write.

Most people want to start the day with a little bit of structure. That’s where the second important feature comes in — templates. We built a way to create a template that shows up at the bottom of your daily note. Each day you get to decide if you want to start with a blank slate, or use your daily template. Templates can be used for any kind of note. Most people take notes on the same handful of things: books, people, meetings, projects, tasks. Thunk will allow you to create as many templates as you like so you don’t have to start from scratch every time.

The third important feature in this category is a special link called a backlink. Backlinks are really useful for two main reasons: they allow you to track your thinking about something over time, and they allow you to associate ideas and see relationships in your notes.

For example, I have a friend named Ryan. We talk to each other periodically throughout the week. Sometimes Ryan gives me suggestions for Thunk, or recommends a book. When I mention “Ryan” in my notes, it creates a backlink to him and saves that snippet of text to my note about Ryan, and when I open that “Ryan” note, I can see all of my interactions with him, which is awesome! At a glance, I can see things like: when did I last talk to Ryan? What did he say about Thunk last time we talked? What was that book he recommended to me? This helps me track my relationship with Ryan over time.

It also helps me to understand how my ideas relate to one another. In the example where he recommended a book, I can also create a backlink for that book. Once I do that, it will again automatically create a note for that book and any time I mention the book in my notes it will compile that information to the book note. And because the first mention of that book is when Ryan recommended it to me, in the book note I can see that Ryan recommended it. On the Ryan note I can see the book. And on my daily note from the day I bought the book I can see all of that. This really helps because our brains are associative by nature. Note taking is linear, but thinking is networked. Backlinks have created a new way of thinking with a computer that is more aligned with how our brains work.

What impact do you hope a more intuitive approach to personal knowledge management can have on the world?

We live in a time when there is incredible information abundance. We have access 24/7 via our cell phones. Having all this information has created a new problem: how do we sort through it all? We need ways to process all of this information. To help us improve our thinking, make sense of our work, and make sense of our lives. Good tools can help us. Tools can help us remember, connect, and explore. Tools can help us store our ideas, and come up with new ones. Tools can help us think. That’s what this is really about: better thinking.

I believe this need for better thinking is at the heart of the current explosion in note taking apps. We’re at an inflection point where software is enhancing human thinking, and it’s desperately needed. This is the culmination of a vision that goes back to the invention of the personal computer in the 80s. It’s an exciting time, but we are facing a serious problem: complicated tools that aren’t making it easier to think. These tools aren’t caring for our time.
I see comments all the time from people that say “I just tried this app and I’m really confused. Everyone seems very excited, but it doesn’t make sense to me. I’m taking an online course, watching YouTube videos and reading a book. I’m still trying to build the right system before I start.” People are lost in an endless system-building quest. It’s not their fault. As tool makers, it’s our job to make these tools learnable.

We can have a world where managing all of our personal knowledge is fun and enjoyable — where we feel invited to sit down every day and work. That is why it’s so important to build a beautiful tool that’s easy to use. It helps us think. It invites us to think. If we can build a tool like that with Thunk, we can inspire the people who use it. We can show them why these things matter. We can help people spend more time thinking and less time preparing to think. If we can help people make sense of the world, we can make a better world, and that’s what it’s all about.

A lot of knowledge workers are perfectionists. How does Thunk address this challenge?

Perfectionism is a problem that I struggle with as well. It’s kind of a double-edged sword, because in some ways there are benefits to perfectionism. It pushes you to strive harder. It’s part of what makes me want to wake up every day and work to improve Thunk. On the other hand, perfectionism can be destructive. I’ve noticed this in my work when things are falling short of perfection and are causing unnecessary stress.

One of the ways Thunk helps with perfectionism is by helping people be more intentional. For example, the Focus Mode pulls away all the distracting elements and lets you get down to work. Another thing we do to try to help perfectionists is to offer a beautiful place to hang out, where you don’t have to spend a lot of time setting things up and customizing things. You can just get down to the business of writing, which we think is the most important part.

What are other ways Thunk can help people stay more focused?

Beyond Focus Mode, we do a lot of other little things to try to help people stay more focused. One of the big ones is the ability to fold everything. You can fold headings and you can fold any kind of list, whether it’s a numbered list or a bulleted list. A lot of other apps do this thing where you have to decide before you create the heading or the list if you want to fold it or not. We think that this kind of upfront friction is unnecessary, so we try to avoid it.

Can you tell us about your mascots, Julian and Nigel?

Our mascots are kind of a funny story. When we started Thunk, we were focused on journaling. I was reading this book called The Artist’s Way by Julia Cameron. In the book, Julia describes a practice called morning pages. The goal is to write three pages of stream-of-consciousness thought first thing in the morning. To do this well, you want to quiet...
Thinking faster to create more with Andrew Nalband founder of Thunk
How to switch from Evernote to Notion
How to switch from Evernote to Notion
Evernote has been the go-to tool for digital note-taking for years, but recently a new app has been making headlines thanks to its advanced functionalities and team collaboration: Notion. With the promise of unlocking the power of an all-in-one workspace, many have contemplated switching from Evernote to Notion. If you are considering making the switch, read on to explore why and how to migrate. Why you may want to switch from Evernote to Notion: While both apps offer ways to capture and organize notes with desktop and mobile devices, this only scratches the surface of their features. If we dig deeper, there are some significant differences to consider before making the switch from Evernote to Notion. Templates: Constantly rewriting the same elements is a waste of time and energy. Templates are a way to save time and effort when writing your notes. With just one click, you can replicate the structure of a note and focus on writing it. While templates are available in both Notion and Evernote, Notion makes it easier to find and apply templates and offers a more extensive selection of templates to import into your note-taking system. Markdown editor: Notion allows you to use Markdown syntax to format your notes, which can help you stay in the flow while writing. In contrast, Evernote uses a WYSIWYG editor, which may disrupt your writing flow, as you need to take your hands off the keyboard to apply formatting. With Markdown, you can type and edit the formatting from the keyboard. Not only that, but Notion utilizes slash commands and dropdown menus to help you stay in flow when writing your notes. Infinite hierarchy: In Notion, you can turn a page into a subpage and make a subpage for that subpage. This allows for the creation of an infinite hierarchy of notes. In contrast, Evernote limits you to stacks and notebooks, restricting how much control you have over organizing your notes. Real-time team collaboration: With Notion, you can work on a document with your team members in real time. You can share your pages with other users and invite them to collaborate in your database for long-term projects. Notion has great team collaboration features such as kanban boards, calendar events, and assigned tasks. While you can share your notes with others in Evernote, Notion is the better choice for teamwork thanks to its expansive team features and ease of collaboration. Why you may not want to switch from Evernote to Notion: As you can see, there are some major differences between Evernote and Notion, but that doesn't mean you should fall prey to shiny toy syndrome. Notion offers more flexibility than Evernote but may not have all the features you need. If the following features are important to you, you might not benefit from switching to Notion. Search functionality: When navigating your notes, Evernote's search feature is more powerful than Notion's. With advanced search modifiers and operators, you can create custom search parameters to find exactly the information you are looking for. Not only that, but Evernote also has optical character recognition (OCR) capabilities so that you can search text inside handwritten notes, PDFs, images, and Office docs. Document scanning: With the Evernote mobile app, you can capture photos and handwritten notes on the go. It's as simple as taking a picture with the app, and a snapshot of the document will be saved to your Evernote account.
This makes Evernote a great app for going paperless, as you can scan your receipts and documents and find them with Evernote's OCR functionality. Offline access: With Evernote, you can access your notes even when you do not have an internet connection. Notion is a web-based app that relies on syncing with its servers to access your notes. Unless you have preloaded your notes, you cannot access your pages when you are offline or if you have a weak internet connection. If you often find yourself working offline, you should write your notes in Evernote instead. PDF annotation: With Evernote, you can annotate your PDF documents. Whether you want to add a drawing, shape, or text, Evernote allows you to annotate your PDFs directly. On the other hand, Notion requires some workarounds to annotate your PDFs. How to migrate from Evernote to Notion: Migrating from Evernote to Notion is extremely simple and requires only three steps. Open the import panel. After installing Notion, go to the left sidebar, where you will find the import button. Clicking on it will show all the import options available. Authorize access to Evernote. Click on the Evernote option and authorize the connection between Notion and Evernote. You can also select how long you would like to authorize the connection for, which can be handy if you want access to your Evernote data to expire once the import is complete. Select the Evernote notebooks you would like to import. After authorizing Notion to access your Evernote account, return to the Notion import screen. You can then select the Evernote notebooks you would like to import. There is also an option to select or deselect all notebooks. Once you are done selecting your notebooks, click on Import. Notion will then start importing your notes into a new database. The time this takes will depend on how many notebooks you import and how many notes are in each one. As you can see, migrating from Evernote to Notion is simple and easy. However, when switching apps, you should start by importing the notes that are most important to you and bring the rest over later. This allows you to start using Notion immediately, as the migration process might take some time if you have many notes in the old app. Getting used to Notion: Great, now we are done with the migration! Here are some small details to take note of when using Notion. Web clipper app. One of the best features of Evernote is the ability to save articles you read into your notebook with the mobile and desktop web clipper. However, Notion has its own mobile and desktop web clipper, which is just as good as Evernote's. If you are missing the web clipping option, you can easily use Notion's web clipper to save important articles into your database. Use database views. Notion databases allow you to look at the same data in different ways. You can view the same database as a table, list, board, gallery, calendar, or timeline. This makes it easier to organize information and projects and gives you another perspective on your information. Unique features of Notion. While you may not have previously used them in Evernote, it's worth exploring Notion's unique features such as backlinks, embeds, and spreadsheets when writing your notes. If you still cannot decide between Evernote and Notion, try using both note-taking apps at once for a week and see which one feels more comfortable.
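Whichever way you lean, it can help to know exactly what is sitting in your Evernote account before (or after) running the importer. Evernote's .enex export files are plain XML, so a short script can list note titles and creation dates. This is only a sketch: the file name is hypothetical and the tag names assume the standard ENEX layout produced by Evernote's "Export notebook" option.

```python
import xml.etree.ElementTree as ET

# "my-notes.enex" is a hypothetical export file created from Evernote (ENEX format).
tree = ET.parse("my-notes.enex")
root = tree.getroot()  # expected to be <en-export> in a standard ENEX file

notes = root.findall("note")
print(f"{len(notes)} notes in this export")
for note in notes[:10]:
    title = note.findtext("title", default="(untitled)")
    created = note.findtext("created", default="?")
    print(f"- {title} (created {created})")
```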
Notion and Evernote do not necessarily have to be mutually exclusive, and some of your workflows might be more efficient if you combine the power of both apps. For instance, some notes may require features such as offline access, robust search functionality and annotating PDFs, while others may require you to switch database views and collaborate. Feel free to experiment with Evernote and Notion to see if you want to use both apps concurrently or decide that you only need to use one app as your tool for thought. And if you are interested in learning about Notion, join our community’s Notion support group. The post How to switch from Evernote to Notion appeared first on Ness Labs.
How to switch from Evernote to Notion
The paradoxical power of humility: how being humble is a strength
The paradoxical power of humility: how being humble is a strength
For too long, humility has been misunderstood. Despite traditionally being viewed as a weakness, psychologists now have a better understanding of the complex effects of humility. Far from being a weakness, being authentically humble has been found to offer many powerful benefits, including improved relationships at work, better team performance, and increased overall wellbeing. So how can you harness the power of humility? The benefits of humility Confusion around the definition of humility is common. Some may incorrectly assume that those who are humble have low self-esteem or a low opinion of themselves. This misinterpretation is not helped by the way we use the concept of humility in everyday language. For example, if you picture a “humble home”, you may imagine a house that is small or lacking in some way. However, when it comes to character traits, humility doesn’t describe a deficit. In fact, those who are humble typically have healthy self-esteem, but without the need to boast about it. A person who is humble is likely to be more courteous, respectful, and in tune with the feelings of others. They are also unlikely to be prone to bragging or arrogance. Scientists have found that being humble has numerous benefits. For instance, Dr Rob Nielsen and Dr Jennifer Marrone noted that in the workplace, “humble individuals acknowledge their limitations alongside their strengths, seek diverse feedback and appreciate contributions from others without experiencing significant ego threat”.  This awareness can lead to better relationships with others. Dr Daryl Van Tongeren and colleagues found that practising humility makes it more likely that an individual will be “other-orientated rather than self-focused”. By being humble, we are better able to celebrate the successes of others without feeling jealous or resentful of their progress. Once we are attuned to how others are feeling, and are able to respect their accomplishments, interpersonal relationships and networks become stronger. Forging such positive connections is of value in friendships, familial and romantic relationships, and for developing a good rapport with colleagues. The power of humble leadership According to Dr Jim Collins, old-fashioned board directors might believe that they need “an egocentric chief to lead the corporate change.” However, Collins argued that rather than superiority, humility is the most useful personality trait. He explains that effective leadership can only occur when decision makers tend to “give credit to others while assigning blame to themselves”. As part of his research, Collins also noted that humility is one of the top five most common characteristics in leadership roles. Dr JianChun Yang and colleagues further investigated why humility among leaders is important. They found that leaders who express humility are likely to see improved growth, development, and performance among their team members. Humble leaders have also been found to create work environments that are more psychologically safe. However, humility shouldn’t be used to fool people. A leader who appears to be humble only to serve management motives will be perceived as less trustworthy. Humility needs to be a genuine character trait, not an act. A humble life is a happy life It’s not only our personal and professional relationships that can benefit from humility. Researchers have found that those who are humble enjoy better physical and mental health, even when faced with stressful life events. 
For example, Dr Neal Krause and his team demonstrated that humility can help buffer the impact of a stressful event, helping to protect overall happiness and satisfaction with life, while also protecting against depression and anxiety. Practising humility can therefore help to boost wellbeing. Despite the benefits of humility across many facets of an individual's life, humility remains largely unendorsed as a character trait. Experts in positive psychology have found it concerning that, despite their association with life satisfaction, humility and modesty are not more highly rated as character traits. Dr John Harvey and Dr Brian Pauwels noted: "It is difficult to understand why so many people at this point in their lives would not endorse modesty and humility as essential to life satisfaction." By "this point in their lives", they were referring to mid-life. Humility has proven to be a powerful personality trait. It can boost work performance, improve the quality of our relationships, and even support mental wellbeing in the face of adversity. It's therefore paradoxical that this personality trait is so woefully under-recognised. Strategies to practise humility: It is clear from the research that humility can be hugely rewarding, not only for yourself, but for those in your personal and professional life as well. If you want to benefit from being more humble, the following strategies will help you strengthen this valuable personality trait. Listen more and speak less. Practising active listening allows you to spend more time trying to understand someone else's point of view. In doing so, you show that you are present, interested, and engaged in the conversation. When it's your turn to speak, ask relevant questions and remain focused on the other person. Avoid making the conversation all about you. This ensures that your friend or colleague feels appreciated. Learning to truly listen will allow everyone in the conversation to thrive. Support others without bragging. Help people around you without bragging about it. If you're truly helpful, people are likely to publicly thank you for your support, without the need for you to be the one highlighting all your effort. And, even if they don't, helping someone else with a project is a good way to learn in and of itself; the recognition for your help is just the cherry on top. Celebrate the success of others. While it's important to celebrate all your wins, even the small ones, humble people also rejoice in the success of others. When someone else begins a new challenge, support them without referring to your own achievements. If a friend accomplishes a goal, show them how impressed you are. Don't relate their achievements to your own. Instead, let them feel proud without comparison. Being humble has commonly been viewed as a weakness, but modern research shows that practising humility can be a powerful tool for connecting with others both personally and professionally. While practising humility will help those around you to feel supported and valued, nurturing this trait can also boost your performance and protect you from low mood and anxiety. However, it's important to remember that any work towards your personal growth must be authentic. In order to benefit from humility, your desire to develop this personality trait must be genuine. To further develop your humility, become more "other-orientated" by practising active listening, being supportive of the people in your life, and whole-heartedly celebrating the success of others without comparing it to your own.
The post The paradoxical power of humility: how being humble is a strength appeared first on Ness Labs.
The paradoxical power of humility: how being humble is a strength
July 2022 Updates
July 2022 Updates
New Things Under the Sun is a living literature review; as the state of the academic literature evolves, so do we. This post highlights some recent updates. An Internet of Ink and Paper: The post "The Internet and Access to Distant Ideas" highlighted three studies from the early days of the US internet to illustrate how access to the internet facilitated innovation. Firms that are connected to each other by the internet are more likely to collaborate on patents or cite each other's work, and counties that would normally be left behind by rising geographic concentration of patenting were better able to buck the trend if they enjoyed greater internet penetration. This post has now been updated to include discussion of a new paper by Hanlon and coauthors, which documents the same kinds of effects for a very different change in the technology of long-distance communication: This isn't the first time we've seen something like the dynamics brought about by the internet. Hanlon et al. (2022) travel even further back in time to 1840 in Great Britain to study what happens to science and invention when the price of the mail drops. Prior to 1840, the cost of posting a letter in Great Britain varied substantially based on the distance the letter needed to travel, as can be seen in the figure below. But in 1840, a greatly simplified pricing system was introduced: posting a domestic letter, of any distance, cost 1 penny. [Figure: pre-1840 British postal rates by distance; modified from Hanlon et al. (2022)] As with the preceding papers, Hanlon and coauthors want to know how this drop in the price of long-distance communication affected collaboration (in science this time) and invention. Though it may seem a bit niche to contemporary readers based outside the UK, as a natural experiment in the effects of communication, this setting has several virtues. In this era, pretty much the only way to communicate with people at a distance was by personal travel or via the postal system (telegrams at this time were primarily used by the railroads, not the general public). So if long-distance communication is important, this price change should matter. Because prices prior to the reform were based on distance, we actually have a lot of variation to work with. Distant towns experienced a big price cut in the costs of communication and nearby towns experienced only a small price cut. We can look to see if the effects of the reform varied across those contexts. The price changes were substantial enough, by the standards of the day, to matter. The price of mailing a one-page letter from London to Edinburgh fell from 10-20% of a professor's daily salary to 0.5-1%! Also suggesting the price cuts were material, there was a very large increase in mail posted following the reforms. To track the impacts, Hanlon and coauthors do two analyses. The first is based on the citations made by articles published in the premier scientific journal of the day, the Philosophical Transactions of the Royal Society of London. For the ten years before and after the postal pricing reform, they locate where the scientists publishing in the Philosophical Transactions live and where the scientists they cite live. This gives them 1,251 citations between scientists in different parts of Great Britain. Analogously to Forman and van Zeebroeck (2019), they show the postal price cut increased citations between towns, and that this effect was larger for towns where correspondence was previously more expensive.
Specifically, the price cuts reduced the "distance" penalty, wherein towns that are farther apart cite each other less, by 70%. Hanlon and coauthors' second analysis tries to assess the impact of the reform on new patents. For this, they have to take a different approach, because even if a patent is drawing on distant knowledge (obtained through mail correspondence), this isn't really visible in the patent document. Patent citations in this era were not a big thing, nor was collaboration at a distance. After locating where each inventor resides, Hanlon and coauthors try to estimate, for every town, how much the postal reform affected that specific town's access to ideas from the rest of Great Britain. By this measure, a town that is very remote from all others would experience a big increase in its access to distant ideas, since prior to the pricing reform it would have been quite expensive to correspond with most of the people in Great Britain. In contrast, a town that lies within a geographical cluster of several large population centers may have experienced a much smaller increase in its access to distant ideas. There are some other complicating details, but again they find the same flavor of result as earlier papers: patents increased by a larger amount in more remote towns following the introduction of uniform postal pricing. So in two quite different settings we observe the same general phenomenon: when communication at a distance becomes easier, access to distant ideas is improved, and this has a disproportionate benefit for places that are otherwise far from where the inventive action is. It didn't make it into the update, but reading these history papers I am always impressed by the amount of work that has to go into creating the dataset. It's no small thing to locate where each inventor lives in every year, which post office is closest, and how much it would cost to correspond with other post offices! You can read the rest of the article (now renamed "The internet, the postal service, and access to distant ideas"), including the pre-existing bits about the early internet, here: Read the whole thing. Networking at Academic Conferences: The post "Academic Conferences and Collaboration" surveyed a few papers that document how academic conferences can be useful for forging new collaborations. This post has been updated to include discussion of a new paper that tackles this question in a new way: Instead of comparing people who attend a conference to those that do not, you can also look within a conference and see if attendees who interact more often during the conference are more likely to collaborate on new projects. Two papers find that is also the case. Zajdela et al. (2022) examine four recent conferences (around 60 attendees each) that mixed large topic discussions of around 10 people with small group discussions of 3-4 people. Zajdela and coauthors estimate how much time people spent interacting at the conferences based on their joint assignment to different sessions (they assume you might have interacted more if the session was longer or if the number of attendees was smaller). At the end of the conference they can see if people spontaneously teamed up to submit a proposal for research funding. Do people who spent more time in the same sessions team up at a greater rate? Yes! But that doesn't tell us much unless we know how these groups were formed.
Maybe the conference organizers tried to match people up who they thought were most likely to want to work together; and maybe these people would have identified each other no matter what, in a conference with just 60 attendees. In that case, time spent in sessions together doesn't matter - these people would always have collaborated. Fortunately, Zajdela and coauthors also know the algorithm that was used to assign people to small and large group sessions. The conferences tried to optimally place people together according to some seemingly desirable, but possibly conflicting, rules. Because this group assignment problem is very complex, the algorithm doesn't exactly solve for the "best" outcome by these criteria. Instead, it just tries to get as close as it can, and there is a bit of randomness in where it ends up. Zajdela and coauthors re-run this algorithm a bunch of times to come up with alternative conference schedules, each of which might well have been the actual schedule but for a bit of algorithmic luck. Then they look to see if collaboration is highly correlated with the actual time spent interacting, rather than the potential time interacting under alternative plausible conference schedules. And it is: among people who did not previously know each other, collaboration was about 9x more likely for pairs that actually attended a small group session together, as compared to pairs who did not attend a small group session together in the real world but would have in alternative possible conference schedules. The post is also updated with a paragraph discussing some results of Lane et al. (2019), which is a longer-run follow-up of one of the other papers discussed in the original post (Lane et al. (2019) has also been covered in more detail here). Read the whole thing. Responding to a Good Counterargument to a Recent Post: The recent post "How common is independent discovery?" surveyed a few lines of evidence to think through how much redundancy there is in science and invention: if the discoverer of some idea had gotten sidetracked and never made the discovery, how likely is it someone else would have come along to make the discovery instead? An email correspondent responding to that post made a really good counterargument to my interpretation of the evidence. I thought a good response to the counterargument was possible, but it would require drawing on a few additional papers. However, since "How common is independent discovery?" was already about as long as I want posts on New Things Under the Sun to be, rather than adding more discussion to that post, I instead decided to split what used to be one long article into two shorter articles. So now there are two (interrelated) articles related to this topic. The original "How common is independent discovery?" has been reorganized and shortened to focus narrowly on papers about exactly what the title promises: independent discovery. Meanwhile a new post titled "Contingency and Science" is now ...
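Going back to the Zajdela et al. result above, here is a stylized sketch of the re-randomization logic: repeatedly re-draw plausible session assignments, compute each pair's shared sessions under those draws, and compare that to the realized schedule. The data and the simple shuffling rule are invented for illustration; the authors' actual algorithm optimizes assignments under several constraints.

```python
import itertools
import random
from collections import defaultdict

random.seed(0)
attendees = [f"A{i}" for i in range(12)]

def random_schedule(attendees, group_size=4, n_sessions=3):
    """Toy stand-in for the conference assignment algorithm:
    shuffle attendees into small groups for each session."""
    schedule = []
    for _ in range(n_sessions):
        pool = attendees[:]
        random.shuffle(pool)
        schedule.append([pool[i:i + group_size] for i in range(0, len(pool), group_size)])
    return schedule

def shared_sessions(schedule):
    """Count, for every pair, how many small-group sessions they share."""
    counts = defaultdict(int)
    for session in schedule:
        for group in session:
            for a, b in itertools.combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return counts

actual = shared_sessions(random_schedule(attendees))
# Counterfactual schedules: pairings that *could* have happened by algorithmic luck.
counterfactuals = [shared_sessions(random_schedule(attendees)) for _ in range(200)]

pair = ("A0", "A1")
expected = sum(c[pair] for c in counterfactuals) / len(counterfactuals)
print(f"{pair}: actual shared sessions = {actual[pair]}, expected under re-draws = {expected:.2f}")
```

The analysis then asks whether collaboration tracks the actual shared time rather than the expected shared time across these counterfactual draws.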
July 2022 Updates
Creating scalable knowledge spaces with Heiko Haller co-founder of Infinity Maps
Creating scalable knowledge spaces with Heiko Haller co-founder of Infinity Maps
Welcome to this edition of our Tools for Thought series, where we interview founders on a mission to help us think better and make the most of our mind. Heiko Haller is the co-founder of Infinity Maps, a tool for visual thinking and knowledge sharing that lets you create deep knowledge maps to manage information overload and collaborate on complex ideas. In this interview, we talked about the unique challenges faced by knowledge workers, the limitations of mind maps and concept maps, the importance of structure, hierarchy, interconnectedness, and scalability, the difference between “finding” and “reminding”, and much more. Enjoy the read!  Hi Heiko, can you start by introducing yourself and give a bit of background about your interests? I saw that you studied computer science and psychology, that sounds fascinating. I have always been passionate about creating brain-friendly learning media. I nearly studied media technology, but then I thought that I would learn the technical side by myself, so I studied psychology to better understand the brain and design brain-friendly knowledge media. I then widened my focus to thinking tools and visualization tools. I studied existing methods and had the idea of combining the benefits of all of these methods. That was the topic of my PhD in computer science, where I was able to make a proof of concept. I have also worked as a consultant for startups and automotive companies, which makes me one of these people who can bridge the cultural gaps between business, technology, and the human side of product design — I speak all three languages. Between your studies in psychology and computer science, and your work as a consultant, you probably understand quite well the challenges faced by knowledge workers. There is a great quote by Peter Drucker that goes: “The most important, and indeed the truly unique, contribution of management in the 20th century was the fifty-fold increase in the productivity of the manual worker in manufacturing. The most important contribution management needs to make in the 21st century is similarly to increase the productivity of knowledge work and the knowledge worker.” But he also adds that the methods to achieve these goals are totally different because manual work and knowledge work are completely different fields. To address the challenges of knowledge workers, like researchers, consultants or students, you need to have a much stronger focus on the psychology of work and on how the brain works. You need to ask yourself: what gets in the way of us thinking really complex or complicated topics? One of the most basic challenges is that our working memory is very limited — you can compare it to the RAM of a computer. It can only hold like a handful of thoughts at once. We can’t think about many things simultaneously, and that becomes a real problem when we deal with complex topics like we are faced with nowadays. Another major limitation is that we are often unable to recall certain things. The good news is that our long-term memory is practically unlimited. There’s no lack of space in our long-term memory. When we have forgotten something, it’s not because it’s out of our memory, it’s just that we can’t find a way to get there. Many knowledge workers have tried to solve these problems using mind maps or concept maps, what do you think about these? 
Mind maps have been proven to be really useful whenever we need to deal with more than a handful of things that we want to juggle, for example if you want to do a brainstorming session and then sort these ideas. When things get too many to grasp, we put them in boxes and we create hierarchies. If you think about geography, we have the solar system, the planets, continents, countries, counties, cities, blocks, houses, flats, rooms. That’s why we are so used to using folder hierarchies on our laptops. It’s natural for us to think in terms of things that are in other things. We never look at the whole scope. We always focus on a certain level. That’s one major thing that mind maps really cover in a nice way. What mind maps are bad at is to show how things interact, or what are the interconnections between topics — for example, this leads to that, and that contradicts that over here — because these relationships are simply not hierarchic and they don’t fit in a mind map. In contrast, concept maps are great for showing interrelations. They are graphs, they are networks. They have nodes and edges. They show not only how things belong together, but also how they interact. But, contrary to mind maps, they’re not so good at showing which is the central topic, where I should start reading to grasp an overview of the whole structure. Another limitation of both mind maps and concept maps is that they work well if you have something between ten and fifty items; but if you have a hundred or more, the metaphor starts to break. I have seen concept maps from NASA that were so complex, you would have to follow the arrows with your finger to see where it leads in the end, because they were just too big and had too many connections to actually just grasp the structure with your eyes. So again, they’re good for what they are good at, but they’re not good if it gets a little larger. What about traditional note-taking apps? Again, they’re quite good at what they are. They are made for capturing single notes and jotting something down quickly. One trick that they use is that usually you’re only looking at one note at a time, so you can have thousands or millions of notes in your note-taking base, and it can hold that easily. What’s missing with really most note-taking apps out there is the overview. You never see the big picture, how things are structured, what the clusters are, what leads to what. That’s one aspect that mind maps and concept maps do really well: giving you an overview. Most note-taking apps don’t cover that. What about some of the newest apps that feature a visual knowledge graph? I still think there is something essential that is missing there: The visualizations in those apps are not stable. They take the structure of your notes and they generate a nice visualization. But if you add another note, the visual layout changes. And then, you can’t find things where they were before. It means that you cannot use your spatial memory because things look different every time. And that’s something that I really like to tap into: using our spatial sense of orientation to get some of the cognitive load in our heads away, using a spatial system so we have more brain power left to think about the actual problem at hand. And that’s basically the iMapping method you have developed as part of your doctoral thesis, right? Yes, that’s what I have called iMapping. 
It is a way of visualizing information structures that combines the core benefits of other approaches: hierarchy, like mind maps; network structure, like concept maps; but also the simplicity of whiteboards. And as a fourth core benefit, it adds scalability in terms of holding really large amounts of notes. The trick was to make the hierarchy not like in mind maps, where it starts in the middle and then branches from the middle out, but to turn that inside out and make the hierarchy go from the outside in, which, as I said, is basically how the world is structured, with planets, continents, and so on. You have boxes that are nested inside other boxes, and you can use lines for the connections, but usually I don't show them. Many of the visualization tools we have nowadays are still based on the pen and paper metaphor. They don't make use of all of the advantages of interactive computer systems. With mind maps and concept maps, it can become really overwhelming to see all these arrows at the same time, especially if you have large, complex, interconnected graphs. It's like spaghetti. So what we do with iMapping is that we only show connections on demand, in an interactive way, only when you touch something. And with that nesting of boxes inside boxes or cards inside cards, you can make maps as large as you want, because you never run out of space. Another reason we don't make these connections front and center is that cognitive tools should make it very low effort to jot something down without thinking about the structure and the spatial mapping, without thinking about what this is connected to and where exactly it should go. Similar to a whiteboard, where you can just write or add a sticky note wherever there's some space (you put it somewhere before you know what's the right place for it), with this kind of canvas space you can just write or throw things wherever you want and then structure them later. So iMapping combines the advantages of mind maps, concept maps, and whiteboards, without any of the drawbacks. How did you translate this theoretical framework into Infinity Maps, which is an actual working tool? It was a long journey. My master's thesis consisted of comparing all these visual approaches. I then decided to pursue a PhD to explore the idea of combining these approaches. Doing a PhD seemed like a great way to have access to an environment where I would get some time and money to evolve the idea, and I could work with computer science students who would implement it. The first prototype was simply called the iMapping tool. It was good enough to be actually used, so I started putting my notes from my dissertation in the tool and mapping it all out. It was such a boost in productivity. I could get all these thoughts sorted and structured, and see what is done, what is connected, what is missing, what the next steps are. If you deal with a certain topic for, say, longer than a day or a week, things tend to get so large that it's really helpful to have everything in one overview. After that, there was a research prototype which was free out there on the web. A magazine wrote about it and abou...
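As a rough illustration of the structure Heiko describes (my own sketch, not Infinity Maps' actual data model): items nest inside items to give the hierarchy, while links between arbitrary items are stored separately so they can be shown only on demand.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """A box that can contain other boxes (hierarchy, as in mind maps)."""
    title: str
    children: list["Item"] = field(default_factory=list)

@dataclass
class Map:
    root: Item
    # Cross-links between arbitrary items (network, as in concept maps),
    # kept separate so they can be rendered only when an item is selected.
    links: list[tuple[str, str, str]] = field(default_factory=list)

    def links_for(self, title: str):
        """Connections shown 'on demand' when the user touches an item."""
        return [link for link in self.links if title in (link[0], link[2])]

# Outside-in nesting, like planets, continents, countries, and so on.
solar = Item("Solar system", [Item("Earth", [Item("Europe", [Item("Germany")])])])
m = Map(root=solar, links=[("Germany", "exports to", "Europe")])
print(m.links_for("Germany"))
```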
Creating scalable knowledge spaces with Heiko Haller co-founder of Infinity Maps
Creative Problem Solving: from complex challenge to innovative solution
Creative Problem Solving: from complex challenge to innovative solution
Even if you usually excel at finding solutions, there will be times when it seems that there’s no obvious answer to a problem. It could be that you’re facing a unique challenge that you’ve never needed to overcome before. You could feel overwhelmed because of a new context in which everything seems to be foreign, or you may feel like you’re lacking the skills or tools to navigate the situation. When facing a difficult dilemma, Creative Problem Solving offers a structured method to help you find an innovative and effective solution. The history of Creative Problem Solving The technique of Creative Problem Solving was first formulated by Alex Osborn in the 1940’s. It was not the first time Osborn came up with a formula to support creative thinking. As a prolific creative theorist, Osborn also coined the term brainstorming to define the proactive process of generating new ideas. With brainstorming, Osborn suggested that it’s better to bring every idea you have to the table, including the wildest ones, because with just a little modification, the outrageous ideas may later become the most plausible solutions. In his own words: “It is easier to tone down a wild idea than to think up a new one.” Osborn worked closely with Sid Parnes, who was at the time the world’s leading expert on creativity and innovation. Together, they developed the Osborn-Parnes Creative Problem Solving Process. To this day, this process remains an effective way to generate solutions that break free from the status quo. The Creative Problem Solving process, sometimes referred to as CPS, is a proven way to approach a challenge more imaginatively. By redefining problems or opportunities, it becomes possible to move in a completely new and more innovative direction. Dr Donald Treffinger described Creative Problem Solving as an effective way to review problems, formulate opportunities, and generate varied and novel options leading to a new solution or course of action. As such, Treffinger argued that creative problem solving provides a “powerful set of tools for productive thinking”. Creative Problem Solving can also enhance collective learning at the organisational level. Dr David Vernon and colleagues found that Creative Problem Solving can support the design of more effective training programmes. From its invention by two creative theorists to its application at all levels of creative thinking — from personal to organisation creativity — Creative Problem Solving is an enduring method to generate innovative solutions to complex challenges. The four principles of Creative Problem Solving You can use Creative Problem Solving on your own or as part of a team. However, when adopted by multiple team members, it can lead to an even greater output of useful, original solutions. So, how do you put it into practice? First, you need to understand the four guiding principles behind Creative Problem Solving. The first principle is to look at problems and reframe them into questions. While problem statements tend to not generate many responses, open questions can lead to a wealth of insights, perspectives, and helpful information — which in turn make it easier to feel inspired and to come up with potential solutions. Instead of saying “this is the problem”, ask yourself: “Why are we facing this problem? What’s currently preventing us from solving this problem? What could be some potential solutions?” The second principle is to balance divergent and convergent thinking. During divergent thinking, all options are entertained. 
Throw all ideas into the ring, regardless of how far-fetched they might be. This is sometimes referred to as non-judgmental, non-rational divergent thinking. It's based on the willingness to consider all new ideas. Convergent thinking, in contrast, is the thinking mode used to narrow down all of the possible ideas into a sensible shortlist. Balancing divergent and convergent thinking creates a steady state of creativity in which new ideas can be assessed and appraised to search for unique solutions. Building on the second principle, the third principle of Creative Problem Solving is to defer judgement. By judging solutions too early, you risk shutting down idea generation. Take your time during the divergent thinking phase to give your mind the freedom to dream up ambitious ideas. Only when engaged in convergent thinking should you start judging the ideas you generated in terms of potential, appropriateness, and feasibility. Finally, Creative Problem Solving requires you to say "yes, and" rather than "no, but" in order to encourage generative discussions. You will only stifle your creativity by automatically saying no to ideas that seem illogical or unfeasible. Using positive language allows you to explore possibilities, leaving space for the seeds of ideas to grow into applicable solutions. How to practice Creative Problem Solving: Now that you know the principles underlying Creative Problem Solving, you're ready to start implementing the practical method devised by its inventors. And the good news is that you'll only need to follow four simple steps. Generating – Formulate questions. The first step is to understand what the problem is. By turning the problem into a set of questions, you can explore the issue properly and fully grasp the situation, obstacles, and opportunities. This is also the time to gather facts and the opinions of others, if relevant to the problem at hand. Conceptualising – Explore ideas. The second step is when you can express your creativity through divergent thinking. Brainstorm new, wild and off-the-wall ideas to generate new concepts that could be the key to solving your dilemma. This can be done on your own, or as part of a brainstorming session with your team. Optimising – Develop solutions. Now is the time to switch to convergent thinking. Reflect on the ideas you came up with in step two to decide which ones could be successful. As part of optimising, you will need to decide which options might best fit your needs and logistical constraints, how you can make your concepts stronger, and finally decide which idea to move forwards with. Implementing – Formulate a plan. Figuring out how you'll turn the selected idea into reality is the final step after deciding which of your ideas offers the best solution. Identify what you'll need to get started, and, if appropriate, let others know of your plans. Communication is particularly important for innovative ideas that require buy-in from others, especially if you think you might initially be met with resistance. You may also need to consider whether you'll need additional resources to ensure the success of complex solutions, and request the required support in good time. Creative Problem Solving is a great way to generate unique ideas when there appears to be no obvious solution to a problem. If you're feeling overwhelmed by a seemingly impossible challenge, this structured approach will help you generate solutions that you might otherwise not have considered.
When you practise Creative Problem Solving, even the most improbable ideas can lead to the discovery of the perfect solution. The post Creative Problem Solving: from complex challenge to innovative solution appeared first on Ness Labs.
Creative Problem Solving: from complex challenge to innovative solution
Do Academic Citations Measure the Impact of New Ideas?
Do Academic Citations Measure the Impact of New Ideas?
Like the rest of New Things Under the Sun, this article will be updated as the state of the academic literature evolves; you can read the latest version here. Audio versions of this article will be available in a few days (I'm traveling). A huge quantity of academic research that seeks to understand how science works relies on citation counts to measure the value of knowledge created by scientists. The basic idea is to get around serious challenges in evaluating something as nebulous as knowledge by leveraging two norms in science: new discoveries are written up and published in academic journals, and journal articles acknowledge related work by citing it. If that story works, then if your idea is influential, most of the subsequent knowledge it influences will eventually find its way into journal articles, and then those articles will cite you. By counting the number of journal articles citing the idea, we have a rough-and-ready measure of its impact.1 This measure of scientific impact is so deeply embedded in the literature that it's absolutely crucial to know if it's reliable. So today I want to look at a few recent articles that look into this foundational question: are citation counts a good measure of the value of scientific contributions? What is Value? Citations as a measure of academic value is a sensitive topic, so before jumping in, it's important to clarify what value means in this context. There are at least two things that are bundled together. First, there is what we might call the potential value of some discovery. Did the discovery uncover something true (or directionally true) about the universe that we didn't know? If widely known, how much would it affect what people believe and do? How would it be assessed by an impartial observer with the relevant knowledge set? Second, there is the actual impact of the discovery on the world out there. Did the discovery actually affect what people believe and do? More specifically, did it affect the kinds of research later scientists chose to do? If science is working well, then we would hope the only difference between these two is time. Optimistically, good ideas get recognized, people learn about them, incorporate the insights into their own research trajectories, and then cite them. In that case, potential value is basically the same thing as actual impact if you let enough time pass. But we have a lot of reasons not to be optimistic. Maybe important new ideas face barriers to getting published and disseminated, because of conservatism in science, or because of bias and discrimination. Or, if those obstacles can be surmounted, maybe there are barriers to changing research directions that prevent scientists from following up on the most interesting new ideas and allowing them to reach their potential. Or maybe low-potential ideas garner all the attention because the discoverers are influential in the field. In that case, citations still reflect actual impact, in the sense that they really do capture how ideas affect what people believe and do. But in this less optimistic scenario, impact and potential value have been partially or completely decoupled, because science isn't very good at translating potential value into realized value. It lets good ideas go to waste and showers disproportionate attention on bad ones. But it's also possible that citations don't even reflect actual impact. This would be the case, for example, if citations don't really reflect acknowledgements of intellectual influence.
Maybe people don't read the stuff they cite; maybe they feel pressured to add citations to curry favor with the right gatekeepers; maybe they just add meaningless citations to make their ideas seem important; maybe they try to pass off other people's ideas as their own, without citation. If these practices are widespread, then citations may not reflect much of anything at all. I'm going to end up arguing that citations are reasonably well correlated with actual impact, and science works well enough that actual impact is also correlated with potential impact. That's not to say there are no problems with how science works - I think there are plenty - but the system isn't hopelessly broken. Finally, bear in mind that my goal here is mostly to assess how useful citation counts are in the context of academic papers that study how science functions. That's a context where we typically have a lot of data: always hundreds of papers, and usually thousands. With a lot of observations, even a small signal can emerge from a lot of noise. In contexts with many fewer observations, though, we shouldn't be nearly so confident that citations are so valuable. If you are trying to assess the contribution of a single paper, or a single person, you shouldn't assume citation counts are enough. To get a better sense of value in this context, unfortunately you probably have to have someone with the relevant knowledge base actually read the paper(s). OK, onward to what we find when we look into these questions. Why do people cite papers? The whole point of tracking citations is the assumption that people acknowledge the influence of previous discoveries by citing the relevant papers. Is that really what citations do though? Teplitsky et al. (2022) tries to answer this question (and others) by asking researchers about why they cited specific papers. In a 2018 survey, they get responses from a bit over 9,000 academics from 15 different fields on over 17,000 citations made. Surveys are personalized, so that each respondent is asked about two citations that they made in one of their own papers. Teplitsky and coauthors construct their sample of citations so that they have data on citations to papers published in multiple years, and which span the whole range of citation counts, from barely cited to the top 1% most cited in the field and year. Among other things, the survey asks respondents "how much did this reference influence the research choices in your paper?", with possible answers ranging from "very minor influence (paper would've been very similar without this reference)" to "very major influence (motivated the entire project)." Assessed this way, most citations do not reflect significant influence. Overall, 54% of citations had either a minor or very minor influence. Given the way these options were explained to respondents, that's consistent with most citations affecting something less than a single sentence in a paper. Only about 18% of citations reflect major or very major influence (for example, they influenced the choice of theory, method, or the whole research topic). That implies citations are a very noisy way of measuring influence. But there's an interesting twist. It turns out the probability a paper is influential is not random: more highly cited papers are also more likely to be rated as major or very major influences. [Figure from Teplitsky et al. (2022)]
Notice the overall data line in the figure says "with citer effects." That's intended to control for the possibility that there might be systematic differences among respondents. Maybe the kind of researchers who lazily cite top-cited work are also the kind of people who lazily answer surveys and just say "sure, major influence." But Teplitsky and coauthors' survey is cleverly designed so they can separate out any potential differences among the kind of people who cite highly cited work versus those who do not: they can look at the probability the same person rates a paper as more influential than another if it also has more citations. Overall, when you additionally try to control for other features of papers, so that you are comparing papers the survey respondent knows equally well (or poorly), the probability they will rate a paper as influential goes up by 34 percentage points for every 10-fold increase in citations. So I take a few things from this survey. First, there is a ton of noise in citation data; just as not all papers are equal, so too are not all citations equal. A paper with 5 citations is quite plausibly more influential than one with 10. But all else equal, there is a pretty strong relationship between the number of citations a paper gets and how influential it is. This measure is subject to a lot of noise, but among very highly cited papers, the relationship between citations and influence is actually stronger than it is for less cited papers. Not only is a paper with 1000 citations more likely to be influential than one with 500 simply because it has so many more chances to be influential, but additionally because each of those chances has a higher probability of being influential. Uncited Influences: Note, however, that Teplitsky and coauthors start with citations: they observe a citation that was made and ask the citer why they made it. But that design means it's impossible to identify work that is influential but uncited. Fortunately, new natural language processing techniques allow us to start answering that as well. Gerrish and Blei (2010) propose a new method to measure the influence of academic papers by looking at how much they change the language of papers that come later. They then show that, indeed, if you try to identify influential papers merely based on the relationships between their text and the text of later articles, there is a reasonably strong correlation between language influence and citations. Gerrish and Blei start with topic models. These are a way of modeling the words used in a paper as the outcome of a blind statistical process. We pretend there are these things out there called "topics" which are like bags of words, where you reach into the bag and pull out a word. Different topics have different mixes of words and different probabilities of grabbing specific words. Next, we pretend papers are nothing more than a bunch of words we grab out of different topic bags. As an example, if I'm writing a paper on the impact of remote work on innovation, then maybe half ...
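Gerrish and Blei's full model tracks how earlier documents shift the topics of later ones; the sketch below shows only the underlying topic-model step they build on, using scikit-learn's standard LDA implementation on a toy corpus. The documents are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "remote work productivity collaboration office workers",
    "patent invention innovation research laboratory",
    "remote collaboration video meetings distributed teams",
    "citation counts journals science research impact",
]

# Bag-of-words counts: each document becomes a vector of word frequencies.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

# Fit a 2-topic model: each topic is a distribution over words,
# and each document is modeled as a mixture of topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)

words = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_words = [words[i] for i in topic.argsort()[-4:]]
    print(f"topic {k}: {top_words}")
print(doc_topics.round(2))  # per-document topic proportions
```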
Do Academic Citations Measure the Impact of New Ideas?
Unshackling knowledge management with Samiur Rahman co-founder of Heyday
Unshackling knowledge management with Samiur Rahman co-founder of Heyday
Welcome to this edition of our Tools for Thought series, where we interview founders on a mission to help us be more creative and more productive without sacrificing our mental health. Samiur Rahman is the co-founder of Heyday, a smart browser extension that automatically saves web pages you visit and content from your apps, and uses AI to resurface that content when it’s most useful. In this interview, we talked about the importance of curating a knowledge base, the bottleneck preventing our best ideas from growing, why we need to break the silos of traditional knowledge management, how AI can help people become more knowledgeable, and more. Enjoy the read! Hi Samiur, thank you so much for agreeing to this interview. What do you think is wrong with most knowledge management tools? Thanks for having me! Our brains were not built to handle the volume of information on the internet. Keeping more than a hundred browser tabs open is just one way people try to stay on top of their information and keep themselves from feeling like they are falling behind. We also dump links in Google Docs and text ourselves content to remember. But it’s impossible to keep up. Today’s knowledge management tools try to help, but only if we change our workflows and update them constantly. They are built primarily for super users who like organizing information, are happy to watch hour-long setup videos, and enjoy spending time inside of their tools. People like me, who aren’t productivity junkies, opt-out. Most people who do a lot of research online don’t want to spend time organizing their information. They just want to be smarter. Is that what inspired you to create Heyday? About a year ago, my co-founder Sam DeBrule and I shut down our previous startup. We had built a knowledge management app that was undifferentiated from Notion and other great apps in the knowledge management space. After years of work, most users who tried it didn’t stick around. But in the process of building it, we stumbled upon a group of users who felt underserved by popular knowledge management tools. They told us they wanted to be more organized, but didn’t have the time, energy, or patience to set up tools and keep updating them. As someone who has ADHD, I have felt the same way for most of my career. And as a Machine Learning engineer, I was drawn toward using automation to free people up from repetitive, uncreative tasks. My co-founder Sam and I started Heyday to help people handle the high volume of information on the internet with less effort. Heyday is fast and easy to get started, layers on top of existing workflows, and requires little manual input. So, how does it work? Heyday’s browser extension automatically resurfaces content that people forgot about. It saves web pages and articles people visit and pulls in documents, links, files, and messages from their apps via integrations. Then it uses AI to resurface that content alongside their existing research. Heyday helps customers learn faster, save time, and feel smarter by surfacing content from past research alongside relevant Google search results to improve recall and prevent them from wasting time seeking information, overlaying articles they are reading with relevant Tweets, articles, and documents to fill gaps in their research and help them understand new topics faster, and curating a knowledge base that fills itself with content related to topics they’ve spent the most time researching to improve retention. 
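Heyday's actual system is proprietary, so the following is only a rough illustration of how "resurface past content relevant to what you're looking at now" can work in principle: score previously saved items against the current page with plain TF-IDF similarity. All titles and text here are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical archive of previously visited pages and saved documents.
archive = {
    "ML paper notes": "transformer models attention pretraining benchmarks",
    "Competitor research doc": "pricing plans onboarding churn saas metrics",
    "Article on note-taking": "backlinks daily notes knowledge management tools",
}

current_page = "a blog post about attention and transformer pretraining"

# Vectorize the archive, then project the current page into the same space.
vectorizer = TfidfVectorizer()
archive_matrix = vectorizer.fit_transform(archive.values())
query = vectorizer.transform([current_page])

scores = cosine_similarity(query, archive_matrix).ravel()
for title, score in sorted(zip(archive, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {title}")
```

A production system would likely use richer semantic embeddings and per-app connectors, but the ranking step it performs is conceptually similar to this.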
Automatically building a knowledge base sounds great, but a lot of people struggle to actually learn from the information they save — how exactly does Heyday help with that? Heyday doesn’t just automatically save information. Our AI curates a list of topics based on what you spend the most time reading to resurface things you care about — to improve retention with a knowledge base that fills itself with quality content. Heyday’s browser extension also overlays pages you’re reading with relevant articles, documents, and conversations from Twitter that link back to them. By showing topical content, Heyday fills gaps in your knowledge and helps you learn about new topics faster. You also made it a point to break the silos of traditional knowledge management. Yes. People need to refer back to content, but then they can’t remember which silo it’s in — Notion, Google Docs, Dropbox, Slack, or maybe they saw it on Twitter. We waste twenty minutes looking for something, but never figure out which app it’s in. As a result, our best ideas don’t develop. Heyday pulls in links, documents, and conversations from Notion, Google Docs, Dropbox, Discord, Slack, and Twitter, to bring the different pieces of your web browsing and knowledge together in a cohesive whole. We then layer relevant content from those places on top of your reading and research, whether you’re doing it in your desktop or mobile browser. Our vision is to become an AI-powered research assistant that works in the background to make people more knowledgeable. We want to be there, conveniently adding context to any interface people use. We are able to add the richest context when we dissolve the barriers standing between people and their content. Heyday sounds so powerful, so we got to ask: what about privacy? Our business model is such that our only incentive is to protect users’ privacy. After a trial, customers pay a subscription for the value we create. We don’t sell users data. We don’t do ads. We encrypt all their data to keep it safe and so that the customer is the only person who sees it. We store our customers’ data for as long as they have an active subscription to improve our product experience. When a customer decides not to renew their subscription, their data is deleted from our system. That’s great to hear. What kind of people use Heyday? People who do a lot of reading and research online, like content marketers, founders, and investors, are among Heyday’s customer base. Here are some reviews from our customers. According to Jan-Erik Asplund, co-founder of Sacra: “Heyday saves me a bunch of time and mental energy during my research process by helping me remember content and work from the past I’ve totally forgotten.” Daniel Zarick, CEO of Arrows, said: “My brain can’t be bothered to collect and organize information as I consume it. It’s just not wired that way. Heyday has been amazing at being my AI assistant, never letting me forget or lose track of information I’ve already checked out.” And from Kyle O’Brien, founder of Startup ROI and operating partner at Revaia: “Heyday is a great product for creators and knowledge workers alike. It basically functions as an extension of my memory that’s searchable, shareable, and easily manipulated. With the amount of tabs I have open at any given time, it’s a shock I can recall anything. Heyday is instrumental in turning ‘browsing entropy’ into useful, organized information for later use.” What about you… How do you personally use Heyday? 
I have a set of topics that I’m always reading about, like Machine Learning. Remembering to save every interesting article/paper on those topics is tedious for me. With Heyday, I review the dashboard for curation suggestions once every few days to be reminded about interesting things I’ve read about the topics I care about. When I’m reading online articles, Heyday’s browser overlay shows me tweets about the article from people I follow. When I’m viewing Google Docs, it shows me other Google Docs that link back to it. This makes me aware of related things I might want to look at to get a fuller picture much more quickly. I’m always trying to remember that interesting thing that I read months or years ago relevant to a conversation I’m having right now, and Heyday is magic for that use case. I always seem smarter than I actually am because I can always find the article, Tweet thread or PDF I’m vaguely remembering to quickly share with a friend. These are great use cases. And finally… What’s next for Heyday? Today, Heyday is a single-player product. In the near future, we will build a multiplayer product experience for teams. Our customers are excited about us building a “hivemind” collaborative research experience that enables ambient learning amongst teammates. By pooling and surfacing content, Heyday will help organizations build an understanding of emergent topics faster and prevent teams from recreating work that already exists. Thank you so much for your time, Samiur! Where can people learn more about Heyday and give it a try? To add Heyday to your browser, please visit our website. You can also follow me and my co-founder for tips at @samdebrule and @samiur1204. The post Unshackling knowledge management with Samiur Rahman, co-founder of Heyday appeared first on Ness Labs.
How to use strength-based journaling for self-esteem and resilience
How to use strength-based journaling for self-esteem and resilience
Have you ever noticed that it is far easier to dwell on mistakes than it is to focus on the things that went well? For instance, you might become fixated on the three seconds in which you tripped over your words during a presentation, rather than acknowledging the remaining twenty minutes in which your performance was flawless. This tendency to dwell on our shortcomings can negatively impact our mental health, leading to low self-esteem, and even to anxiety and depression. However, simple, accessible tools such as strength-based journaling are available to help us build resilience. The science of strength-based journaling As adopting psychological techniques that help us to focus on character strengths has been shown to increase overall well-being, positive psychology interventions based on strength-based therapy have been of great interest to psychologists in recent years. In essence, strength-based approaches help individuals focus on what is already working. Researchers Christopher Peterson and Martin Seligman argued that out of 24 possible character strengths, individuals possess three to seven strengths that best describe them. These strengths are believed to be malleable, meaning that, once identified, they can be used for strength-based exercises that target overall well-being. However, just knowing what your strengths are is not enough. Indeed, Seligman later discovered that when participants were asked to use each of their so-called “signature strengths” in a new way each day for one week, they reported greater happiness and reduced depression at various intervals until the end of the 6-month study. But simply identifying one’s strengths had little effect on happiness, depression, or overall well-being. This suggests that to have the greatest impact on our mental health, the identification of strengths should be combined with a tool to express, reflect on, and develop those qualities. In line with these earlier findings, René Proyer and colleagues found that working on signature strengths was beneficial for our well-being. As part of the study, participants could select from 24 character strengths including creativity, curiosity, bravery, kindness or honesty. Compared to a placebo group, those who focused on their strengths demonstrated increased happiness, greater satisfaction with their health, and improved perception of their general living conditions. Strength spotting, which can be done just by using a journal, helps us shift from being preoccupied with repairing the worst parts of ourselves to instead building on the best parts already within us. By allowing us to make the most of our strengths, it can help unlock the opportunities already within us. Developing this confidence may lead to resilience and greater self-esteem, an important asset when life feels challenging. Getting started with strength-based journaling Strength-based therapy can be costly and time-consuming, but strength-based journaling provides an extremely convenient way to experience the benefits of positive psychology from the comfort of your own home. It’s free and you can get started at any time. There are no special tools required. Simply choose a notebook, get a fresh sheet of paper, or use your favourite note-taking app. Staring at a blank page while hoping for inspiration can be daunting. For this reason, you may find it helpful to use a list of strength-based prompts to get you started and help you to quickly discover more about your individual talents. 
Prompts might include: If you asked your closest friend about your best qualities, what would they say? Describe the last time your manager praised your work. Write down in detail your proudest accomplishment. What do you appreciate most about your personality? How do you maximise your personal strengths at work? Try to think of ten prompts spanning both your professional and personal areas of strength. You may find it helpful to include prompts that cover your career, friendships and relationships, your values, your personality traits, and how you handle difficult feelings such as guilt or regret. By using a wide range of prompts, you can create a detailed record of your strengths bridging the different facets of your life. Once you have decided on your prompts, begin to elaborate on each one. Your answers might take the form of bullet points, or you may prefer to write in long-form prose to describe your strengths in detail.  There is no pressure to answer every prompt in one sitting. Research suggests that writing about your positive experiences three times weekly for just 15 minutes leads to improved well-being, decreased anxiety, and more resilience. In short, apply these three easy steps to start applying strength-based journaling: Craft a curated list of strength-based journaling prompts to choose from. Block recurring sessions in your calendar, which can combine strength-based journaling with other self-reflection exercises, such as your weekly review. Write down your answers for as little as 15 minutes. Strength-based journaling is a free, convenient, and accessible positive psychology tool. This mindfulness method helps us to shift from dwelling on negative thoughts and beliefs to focusing on our strengths and capabilities. Once you have developed your list of prompts, you can build the habit of regularly dedicating a few minutes to exploring your strengths. By focusing on the positives in your life and your character, you will boost your self-esteem and resilience, making it easier to manage your mental health. The post How to use strength-based journaling for self-esteem and resilience appeared first on Ness Labs.
Present bias: how instant gratification impacts your long-term goals
Present bias: how instant gratification impacts your long-term goals
How many times have you heard the phrases "live for the moment", "you only live once", or "seize the day"? This advice may sound great for adding some spontaneity to your life, but seizing short-term opportunities can lead you to settle for a small present reward rather than wait for a larger future reward. This tendency is known as the present bias. It may feel good at the time, but the present bias can negatively impact long-term planning, decision making and productivity. The neuroscience of the present bias The classic experiment used to illustrate whether the present bias is at play is to ask, "would you prefer $100 today, or $110 in one week?" The desire for instant reward leads many to take the $100 immediately, rather than waiting for the better but delayed reward. Preferring smaller instant rewards rather than waiting for larger future ones is known as being present-biased. The present bias is not a new phenomenon. Notions of the pursuit of instant gratification date back to the ancient Greeks, and modern psychological research into this cognitive bias began in the 1930s. In 1968, Edmund Phelps and Robert Pollak formalised this tendency in an influential model of time discounting, which describes how many people discount larger, future rewards in favour of smaller, immediate ones. In one study, researchers recorded neurological responses in the brain while participants made reward decisions, both for themselves and on behalf of other people. The research team, led by Dr Konstanze Albrecht, found that there was much stronger activation in reward-related areas of the brain when participants were asked to make decisions that would lead to an immediate reward. Conversely, if the reward would not be experienced for some time, and therefore represented delayed gratification, there was far less activation within the reward centres of the brain. Similar patterns were also observed in the emotion centres of the brain. Various scientific studies have investigated the contributions of different areas of the brain involved in the present bias. For instance, Dr Marc Wittmann and colleagues asked individuals to make a hypothetical choice between an immediate reward and a reward that would be delayed for at least a year. The researchers discovered that when participants indicated that they would rather wait for the delayed future reward, strong activation was seen within the posterior insular cortex, a structure that lies beneath the outer folds of the brain. In contrast, decisions associated with an immediate reward led to activation within the corpus striatum, a component of the reward system that lies deep within the brain. Wittmann and colleagues concluded that those who have strong neurological activation within the corpus striatum may be present-biased, leading to a desire to seek instant gratification and difficulty waiting for delayed rewards — even if being patient could lead to greater overall success. How the present bias impacts your success It's tempting to take a reward that is available right now, but this cognitive bias can negatively impact your future self. For example, you may work extremely hard to get a place on a law degree course, but then give in to the present bias and spend most of your time socialising with your new friends at the expense of your exams.
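To make the discounting story above concrete, here is a minimal sketch of a "beta-delta" style discounting calculation in the spirit of the Phelps and Pollak model mentioned earlier. The parameter values (beta = 0.8, delta = 0.99 per week) and the helper function are illustrative assumptions, not figures from the article; the point is simply that a modest extra penalty on any delay is enough to make $100 today beat $110 next week, while the same trade-off viewed a year in advance flips the other way.

```python
# Illustrative sketch of present bias using a quasi-hyperbolic ("beta-delta")
# discounting model. The parameter values are made up for illustration only.

def discounted_value(amount, delay_weeks, beta=0.8, delta=0.99):
    """Perceived value today of `amount` received after `delay_weeks`.

    beta < 1 applies an extra penalty to *any* delay (the present bias);
    delta is the ordinary per-week discount factor.
    """
    if delay_weeks == 0:
        return amount
    return beta * (delta ** delay_weeks) * amount

# The classic choice: $100 today vs $110 in one week.
now = discounted_value(100, 0)      # 100.0
later = discounted_value(110, 1)    # 0.8 * 0.99 * 110 = 87.1
print(now, later)                   # the immediate $100 wins, even though $110 > $100

# The same trade-off viewed a year in advance: $100 in 52 weeks vs $110 in 53.
far = discounted_value(100, 52)
far_plus_week = discounted_value(110, 53)
print(far, far_plus_week)           # now the larger, later reward wins: preference reversal
```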
Being offered a higher salary can cause us to forget about our long-term career goals, while being patient and waiting a little longer might lead to the opportunity to create your dream role and potentially an even larger salary. Impulsivity is strongly associated with the present bias, and the desire for instant gratification can lead to reckless spending. Rather than putting money aside for a long-term investment, we might choose to spend that money on "feel-good" purchases. Booking flights, excursions, tours or theatre tickets can be triggered by a need to escape the treadmill of daily life. The present bias can also lead to procrastination. Rather than making progress on a project well before the deadline, you might fall prey to a more instant reward, such as watching a TV series. This activity feels good at the time but leads to additional stress when you then have to rush to complete the work. Thankfully, there are practical ways to manage the present bias and avoid its negative impact on decision making, long-term planning, and productivity. Strategies to manage the present bias The most important aspect of managing the present bias consists of staying mindful of your long-term goals so you don't fall prey to the dangers of instant gratification. Here are three simple strategies you can use to keep your goals in mind and be careful when faced with instant rewards: Write down your long-term goals. You can do this exercise on paper or in your note-taking tool of choice. Create a list of all your long-term goals, whether they are things you want to achieve this year or over the course of the next few years. When faced with a choice between different rewards, refer back to your list to ensure your goals are not overshadowed by the lure of instant gratification. Delay new purchases until tomorrow. Did you find a shiny new toy you feel like you need to buy? If you're in a physical store, snap a photo of the product; or copy-paste the link to your note-taking tool if you're in an online store. Leave it until the day after to decide whether you still want to make the purchase. The extra friction will lower your impulsivity levels and allow time to make an informed decision. Follow the Ten Minute Rule. If you need to complete a task but find yourself procrastinating, tell yourself you will start it right now for just ten minutes. Once you get started, you will probably find yourself in a good state of flow, which means you may work at it for much longer than ten minutes. The Ten Minute Rule will reduce the impact of the present bias on your productivity. The present bias can impact our long-term plans, cause reckless decision making, and reduce our productivity. Our desire for instant gratification can be managed by keeping your medium to long-term goals in mind and trying to delay impulsive decisions. By making future-focused choices now, you can look forward to reaping the benefits in the months and years to come. The post Present bias: how instant gratification impacts your long-term goals appeared first on Ness Labs.
Cognitive bottlenecks: the inherent limits of the thinking mind
Cognitive bottlenecks: the inherent limits of the thinking mind
The “thinking mind” is the part of the mind that seeks to make sense of the world; it analyses situations, imagines scenarios, evaluates solutions, and tells stories. It’s an inherent aspect of what makes us human. However, it’s limited by multiple cognitive bottlenecks. Why does it matter? Because these cognitive bottlenecks limit how much information we can process at one time, how many tasks we can simultaneously focus on, and how many parameters we can consider while making a decision. Intrinsic limitations of the thinking mind The human mind has many limitations. For instance, our limited sensory capabilities mean that there are many sources of information we cannot perceive at all. Dogs can smell emotions such as fear and ​​can tell the difference between two people based on their scent alone. Bees can see infrared, which is invisible to the human eye. For the most part, we are naturally aware of these sensory limitations. We know we are not able to perform echolocation or to see in the dark. But, somehow, we tend to overestimate our cognitive capacities — our ability to concurrently process multiple streams of information or to work on several tasks at the same time. So we multitask: we respond to emails while we listen to a presentation, we monitor social media channels while creating new content, we finish typing up a report while responding to a colleague’s questions. We believe that by combining two tasks, we will complete them sooner than if we worked on them separately. We also trust that we can consider many different facts when making complex decisions. But our thinking mind is limited by two Big Bad Bottlenecks: our attention and our working memory. Using brain scanning and behavioural experiments, researchers at the Center for Integrative and Cognitive Neurosciences at Vanderbilt University have identified a unified attentional bottleneck in the human brain, which impacts both perception and action. Simply put, we’re bad at dividing our attention between different tasks. The research team explains: “What the present results point to is the severe capacity limit of this adaptive coding system in implementing more than one task set at a time, thereby impeding our ability to consciously perceive, and appropriately respond to, successive events in the world.” Working memory is the second major cognitive bottleneck that limits our thinking mind. It allows us to retain multiple pieces of information for short-term processing. It’s what’s involved when you’re trying to keep a number in mind while solving an equation or when you’re holding onto multiple concepts so you can connect them together in your head, and it’s important for activities such as reading, writing, having a conversation, and making decisions. The problem is that working memory is extremely limited in capacity and duration, which can impact learning and decision-making. In the words of Dr Bill Cerbin, professor of psychology and director of the Center for Advancing Teaching and Learning, explains: “Humans are endowed with remarkable cognitive capacities but one area where we are seriously limited is working memory.” “Working memory is the mental space where we do conscious, active thinking — and that space has limited capacity. (…) A fundamental problem in learning is that working memory is a bottleneck — everything new that we learn has to go through working memory before we can commit it to permanent or long term memory,” he adds. 
Everyone will have different profiles for levels of attention and working memory, and your cognitive capacities also vary throughout the day and throughout the years. Being aware of the existence of these cognitive bottlenecks can help you avoid being overconfident in your cognitive capacities, and to make more sound decisions at work and in your daily life. How to manage your cognitive bottlenecks You’d think that with sufficient training you may be able to overcome these cognitive bottlenecks. However, research suggests that such interventions don’t yield long-term improvements. So what exactly can you do to deal with these limitations? Here are three strategies that don’t rely on specific training and that you can start applying straight away: Offload some of your thinking. Instead of relying on your mind alone for information processing and decision making, use tools for thought that help you collect and connect ideas together. Start taking notes and applying mental models to navigate complex decisions. Maintain a mind garden to track your thoughts. You can even find a thinking buddy or join a community to discuss your projects, ask questions, share your doubts, and gather more insights. Plan for focused chunks of work. Instead of trying to multitask, define clear tasks and block time in your calendar to complete them. Close all other apps, put your phone in another room, and make sure people around you know that you are in focus mode — for example by closing the door or by wearing your headphones. Practice mindful productivity. Instead of blaming yourself anytime you notice you’re distracted, gently bring back your attention to the task at hand. If it keeps on happening, simply take a short break to recharge your mental batteries. Calmly acknowledge and accept your feelings and thoughts while engaged in work or creative activities. Create a metacognition practice for yourself, such as journaling or a weekly review. Once we get rid of the illusionary multitasking and the toxic productivity, cognitive bottlenecks are not inherently bad. They are just characteristics of our mind we need to consider when we plan our work and interact with the world. Instead of investing in expensive brain training apps, apply the above simple cognitive management strategies to unshackle your productivity without creating unnecessary stress. The post Cognitive bottlenecks: the inherent limits of the thinking mind appeared first on Ness Labs.
How common is independent discovery?
How common is independent discovery?
Like the rest of New Things Under the Sun, this article will be updated as the state of the academic literature evolves; you can read the latest version here. Audio versions of this and other posts: Substack, Apple, Spotify, Google, Amazon, Stitcher. An old divide in the study of innovation is whether ideas come primarily from individual/group creativity, or whether they are "in the air", so that anyone with the right set of background knowledge will be able to see them. As evidence of the latter, people have pointed to prominent examples of multiple simultaneous discovery:
- Isaac Newton and Gottfried Leibniz developed calculus independently of each other
- Charles Darwin and Alfred Wallace independently developed versions of the theory of evolution via natural selection
- Different inventors in different countries claim to have invented the lightbulb (Thomas Edison in the USA, Joseph Swan in the UK, Alexander Lodygin in Russia)
- Alexander Graham Bell and Elisha Gray submitted nearly simultaneous patent applications for the invention of the telephone
In 1922, Ogburn and Thomas compiled a list of nearly 150 examples of multiple independent discovery (often called "twin" discoveries or "multiples"); Wikipedia provides many more. These exercises are meant to show that once a new invention or discovery is "close" to existing knowledge, then multiple people are likely to have the idea at the same time. It also implies scientific and technological advance has some built-in redundancy: if Einstein had died in childhood, someone else would have come up with relativity. But in fact, all these lists of anecdotes show is that it is possible for multiple people to come up with the same idea. We don't really know how common it is, because these lists make no attempt to compile a comprehensive population survey of ideas. What do we find if we do try to do that exercise? Simultaneous Discovery in Papers and Patents A number of papers have looked at how common it is for multiple independent discovery to occur in academic papers. An early classic is Hagstrom (1974), which reports on a survey of 1,947 academics in the spring of 1966. Hagstrom's survey asked mathematicians, physicists, chemists, and biologists if they had ever been "anticipated"; today, we would call this getting scooped. Getting scooped isn't that uncommon: 63% of respondents said they had been scooped at least once in their career, and 16% said they had been scooped more than once. For our purposes, the most illuminating question in Hagstrom's survey is "how concerned are you that you might be anticipated in your current research?" Fully 1.2% of respondents said they had already been anticipated on their current project! Let's assume people are, on average, halfway through a research project. If they have a constant probability of being scooped through the life of a project, then that implies the probability of getting scooped on any given project is on the order of 2.5%, at least in 1966. Hill and Stein (2020) get similar results, studying the impact of getting scooped over 1999-2017 for the field of structural biology. Structural biology is a great field for studying how science works because of its unusually good data on the practice of science. Structural biologists try to figure out the 3D structure of proteins and other biological macromolecules using data on the diffraction of x-rays through crystalized proteins.
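As an aside, the jump from "1.2% already anticipated" to "on the order of 2.5% per project" is just a back-of-envelope calculation; here is a minimal sketch of that arithmetic under the assumptions stated above (respondents are, on average, halfway through their current project, with a roughly constant risk of being scooped over its life).

```python
# Back-of-envelope behind the ~2.5% figure quoted above. Assumptions (from the
# text): respondents are, on average, halfway through their current project,
# and the risk of being scooped arrives at a constant rate over the project.

already_scooped_at_midpoint = 0.012   # 1.2% of Hagstrom's respondents

# With a constant, small per-unit-time risk, the chance of having been scooped
# by the midpoint is roughly half the chance over the whole project.
scooped_per_project = already_scooped_at_midpoint * 2
print(f"Implied probability of being scooped on a given project: {scooped_per_project:.1%}")
# -> 2.4%, i.e. "on the order of 2.5%"
```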
When they have a model that fits the data well, the norm (and often publication requirement) is to submit the model to the Protein Data Bank. This submission is typically confidential until publication, but creates a pre-publication record of completed scientific work, which lets Hill and Stein see when two teams have independently been working on the same thing. Since almost all protein models are submitted to the Protein Data Bank, Hill and Stein really do have something approaching a census of all "ideas" in the field of structural biology, as well as a way of seeing when more than one team "has the same idea" (or more precisely, is working on closely related proteins). Overall, they find 2.9% of proteins involve multiple independent discovery, as defined above, quite close to what Hagstrom reported in 1974. Painter et al. (2020) takes yet another approach to identifying multiple simultaneous invention, this time in the field of evolutionary medicine (2007-2011). Their approach is to identify important new words in the text of evolutionary medicine articles, and then to look for cases where multiple papers introduce the same new word at roughly the same time. In their context, this usually means an idea has been borrowed from another field (where a word for the concept already exists) and they are looking for cases where multiple people independently realized a concept from another field could be fruitfully applied to evolutionary medicine. To identify important new keywords, they take all the words in evolutionary medicine articles and algorithmically pick out the ones unlikely to be there based on their frequency in American English. This gives them a set of technical words that are not common English words. They build up a dictionary of such terms mentioned in papers published between 1991 and 2006; these are words that are "known" to evolutionary medicine in 2007. Beginning in 2007, they look for papers that introduce new technical words. Lastly, they consider a word to be important if it is mentioned in subsequent years, rather than once and never again. Over the period they study, there were 3,488 new keywords introduced that went on to appear in at least one subsequent year. Of this set, 197 were introduced by more than one paper in the same year, or 5.6%. As a measure of independent discovery, that's probably overstated, since it doesn't correct for the same author publishing more than one paper using the same keywords. Again, I think something in the ballpark of 2-3% sounds plausible. Painter and coauthors go on to focus on a small subset of 5 keywords that were simultaneously introduced by multiple distinct people and which were very important, being mentioned not just again, but in every subsequent year. Bikard (2020) is another attempt to identify instances of multiple independent discovery, though in this case it's harder to use the data to estimate how common they are. Bikard argues that when the same two papers are frequently cited together in the same parenthetical,1 then that is evidence they refer to the same underlying idea. Bikard algorithmically identifies a set of 10,927 such pairs of papers in the PubMed database and shows they exhibit a lot of other hallmarks of being multiple independent discoveries: they are textually quite similar, published close in time, and frequently published literally back-to-back in the same journal issue, which is one way journals acknowledge co-discovery.
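To make the keyword exercise concrete, here is a small sketch of the counting step described above: build a dictionary of technical terms from the 1991-2006 corpus, treat terms outside it as new from 2007 onward, and compute the share of new terms introduced by more than one paper in their first year. The toy records and helper names are mine, not Painter et al.'s; with the reported numbers, 197 of 3,488 new keywords works out to roughly 5.6%.

```python
from collections import defaultdict

# Toy illustration of the counting logic described above. Real inputs would be
# (paper, year, keyword) records extracted from evolutionary medicine articles;
# the records below are invented for illustration.
known_terms = {"pleiotropy", "senescence"}   # dictionary built from the 1991-2006 corpus
papers = [                                   # (paper_id, year, technical keywords in the paper)
    ("p1", 2007, {"mismatch", "senescence"}),
    ("p2", 2007, {"mismatch"}),
    ("p3", 2008, {"hygiene_hypothesis"}),
    ("p4", 2009, {"hygiene_hypothesis"}),
]

# For each keyword not already in the dictionary, collect the papers that use
# it in its first year of appearance.
introducers = defaultdict(set)
for pid, year, keywords in papers:
    for kw in keywords - known_terms:
        first_year = min(y for _, y, kws in papers if kw in kws)
        if year == first_year:
            introducers[kw].add(pid)

co_introduced = sum(1 for pids in introducers.values() if len(pids) > 1)
share = co_introduced / len(introducers)
print(f"{co_introduced} of {len(introducers)} new keywords were co-introduced ({share:.0%})")
# With Painter et al.'s reported counts, 197 of 3,488 gives roughly 5.6%.
```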
Given 29.3 million papers in PubMed, if there are only 10,927 instances of multiple discovery, that would naively suggest something on the order of 0.03% of papers having multiple independent discovery. But while Bikard's publicly available database of twin discoveries is useful for investigating a lot of questions related to science, it's less useful for ascertaining the probability of independent discovery. That's because the algorithm requires articles to have the right mix of characteristics to be identified as simultaneous discoveries. For example, in order to identify if two articles are frequently cited together in the same parenthetical block, Bikard needs each paper to receive at least 5 citations, and he needs at least three papers that jointly cite them to have their full text available, so he can see if those citations happen in sequence inside a set of parentheses. It's unclear to me how many of the 29.3mn papers in PubMed meet these criteria. But we can at least say that as long as no less than 1 in 100 papers meet the criteria, then Bikard's method suggests a rate of simultaneous discovery that is significantly lower than 3%. To close out this section, let's turn to patents. Until 2013, the US patent system featured an unusual first-to-invent system wherein patent rights were awarded not to the first person to seek a patent but to the first person to invent it (provided certain conditions were met). This meant that if two groups filed patents for substantively the same invention, the US patent office initiated something called a "patent interference" to determine which group was in fact the first to invent. These patent interferences provide one way to assess how common simultaneous invention is at the US patent office. Ganguli, Lin, and Reynolds (2020) have data on all 1,329 patent interference decisions from 1998-2014. Of this set, it's not totally clear how many represent actual simultaneous invention. In a small number of cases (3.5%), the USPTO ruled there had in fact been no interference, but in some cases one party settles or abandons their claim, or ownership of the patents is transferred to a common owner. In these cases, we don't necessarily know if the patents were the same. But it turns out this doesn't really matter for making the argument that simultaneous invention is very rare. For the sake of argument, let's assume all 1,329 patent interference decisions correspond to cases of independent discovery. On average, it takes a few years for a patent interference decision to be issued. So let's assume, for the sake of argument, these decisions come from the set of granted patents whose application was submitted between 1996 and 2012. Some 6.3mn patent applications (ultimately granted) were submitted over this time period, which implies 0.02% of patent applications face simultaneous invention. That's a lot less than the 2-3% we found in some academic papers! Inferring the Probability of Rediscovery All these approaches suggest simultaneous discovery is in fact not very common. But simultaneous discovery i...
Audio: How common is independent discovery?
Audio: How common is independent discovery?
This is an audio read-through of the initial version of How Common is Independent Discovery? Like the rest of New Things Under the Sun, the underlying article upon which this audio recording is based will be updated as the state of the academic literature evolves; you can read the latest version here.
The psychology of unfinished tasks
The psychology of unfinished tasks
Unfinished tasks can feel overwhelming, leading to procrastination and slowing your progress. On the other hand, the annoyance of having all of these unfinished tasks on your to-do list may motivate you to tackle them at the next opportunity. These contradictory experiences are due to two effects: the Zeigarnik effect and the Ovsiankina effect. A productive psychological tension In 1927, psychologist Bluma Zeigarnik reported that individuals tend to have a better memory for tasks that are interrupted or incomplete, than they do for tasks that have been completed. Zeigarnik and her supervisor, professor Kurt Lewin, observed that their restaurant waiter had an exceptional memory for what everyone at the table had ordered, despite never writing anything down. However, it later emerged that he only retained the information until each table left. After this, he would have little or no memory of the customers, which table they had been sat at, or what they had ordered. Following this encounter, Zeigarnik carried out a series of experiments on the relationship between tasks and memory. She concluded that it’s possible for the human memory to distinguish between tasks that have been completed and those that are still left to complete, and that we tend to remember unfinished tasks better. This phenomenon became known as the Zeigarnik effect. According to Zeigarnik’s research, an unfinished task will remain prominent in our minds because we know that we have left it incomplete. Zeigarnik explained that each task we start produces a form of psychological tension. If we’re interrupted partway through the task by a phone call or meeting, the tension of the task remains prominent in our minds. This means that when we return to it, the information is still present. The psychological tension, and our recall of relevant information, will therefore only fade once the task is completed. One of Zeigarnik’s colleagues, Maria Ovsiankina, investigated the impact of interruptions on productivity. In 1928, Ovsiankina found that, compared to a task that has not yet been started, individuals have a stronger urge to complete interrupted or unfinished assignments. The Ovsiankina effect describes a state in which not completing a task leads to intrusive thoughts, creating a strong desire to complete the brief. This means that starting a project may increase your desire to finish it, because procrastinating and leaving it unfinished feels unpleasant.  Ovsiankina therefore showed that even if you know you don’t have time to complete something all at once, it may still be worth making a start. Once a project is underway, your dedication to completing it will increase. Using unfinished tasks as a productivity tool By supporting our short-term memory and encouraging completion of an activity, unfinished tasks can be useful as a productivity tool. However, it only works if you don’t leave tasks hanging over you for too long. For instance, the Zeigarnik effect can subject us to the “Tyranny of the Shoulds”, as described by psychotherapist Karen Horney, in which we compare who we are (the real self) with who we feel we should be (the ideal self). If we leave tasks unfinished for too long, the resultant rumination or anxiety can impact our self-esteem. The Ovsiankina effect can also lead to a “quasi-need”, or a need that is not essential but nevertheless pulls our attention. 
Psychologists Oliver Weigelt and Christine Syrek discovered that leaving assignments unfinished over the weekend causes people to ruminate on the unfinished tasks, which leads to difficulty switching off from work. The researchers found that spending a little time over the weekend finishing tasks or preparing for the following week could prevent rumination and stress. Describing it as “closure”, they noted that ticking a task off the list then made it easier for people to enjoy their remaining time off. Although there are drawbacks associated with unfinished tasks, they can be used to boost memory and encourage task completion. The following steps will help you develop a strategy for using unfinished tasks to your advantage: Start even if you can’t finish. It may feel more productive to wait until you have enough time to complete a task in its entirety. However, the psychology of unfinished tasks suggests that it’s better to start working on a task, even if you can’t finish it in one go. Once started, you will feel more inclined to finish the job at the earliest opportunity. Follow the ten minute rule. Fight procrastination by talking yourself into getting started with the ten minute rule. There’s a good chance that once you get started, you’ll keep going for longer than ten minutes. And even if you don’t, the combined power of the Zeigarnik effect and the Ovsiankina effect will make it more likely you will finish the task later. Take breaks. Taking breaks helps restore your motivation, prevent decision fatigue, consolidate your memories, increase your creativity, and improve your well-being. In addition, the Zeigarnik effect shows that your mind will naturally work to retain information when you take regular breaks, therefore boosting your productivity. And when a task is left unfinished, the Ovsiankina effect will draw you back to ensure you finish the job. Critically appraise your tasks. If you notice that despite applying these strategies you still have tasks that are left unfinished for too long, consider whether these tasks are a priority. Use the Eisenhower matrix or the MoSCoW method of prioritisation to delete or delegate some of these tasks. Practise self-compassion. The downside of the Zeigarnik and Ovsiankina effects is that an unfinished task can cause stress and anxiety through intrusive thoughts. Don’t beat yourself up when you have a long list of unfinished tasks. Instead, be kind to yourself and practice mindfulness through journaling, meditation, and exercise. The Zeigarnik and Ovsiankina effects can be useful productivity tools. Rather than procrastinating or leaving tasks incomplete, these effects encourage us to pick up unfinished tasks. However, when not managed correctly, these psychological phenomena can lead to cognitive dissonance and intrusive thoughts. Apply a strategic approach to your unfinished tasks and don’t forget to practice self-care, so you can make the most of the Zeigarnik and Ovsiankina effects without sacrificing your mental health. The post The psychology of unfinished tasks appeared first on Ness Labs.
How to access paywalled research papers without institutional access
How to access paywalled research papers without institutional access
The Internet is full of extraordinary allegations, promises of breakthrough discoveries, and content promoting new, innovative products. Some of these claims are supposedly based on scientific evidence, linking to research which you are told to read. So, you look it up, but the papers are hidden behind paywalls. What should you do? One option is to accept all those claims at face value. In a perfect world where everyone was an honest citizen driven by doing the most good, this could be a viable approach. However, whether it is because of ulterior motives or laziness, a lot of information published online is not based on strong evidence. Linking to a paper doesn’t guarantee you can trust that what you’re reading is a true reflection of the original results, nor that the study constitutes strong evidence in the first place. In order to assess whether you can trust a claim, you need to access the original paper. Fortunately, even when the paper is paywalled, there are a few workarounds. An opportunity to nurture your network Paywalls are a pain, there is no denying that. However, you can turn your annoyance into an opportunity for growth by either making the most of your existing network, or expanding your network to include more experts in the fields you are curious about. Here are some ways you can access paywalled research papers that will help you nurture your professional network: Contacting the author(s). All papers, including paywalled papers, display the contact information for the corresponding author on the page. Send them a polite email asking if they can send you a PDF version of the paper. Most researchers will be happy to oblige, and even flattered that someone is taking an interest in their research, so you don’t even need to give a justification. Obtaining alumni access. If you went to university, check with your alumni association whether they have a programme that gives alumni access to library resources. For example, many universities around the world give access to JSTOR as a perk of joining the alumni association. Getting a courtesy appointment. Do you have any connections with an educational institution? It doesn’t have to be your former university. You can have a look at your local institutions, and offer to help with some projects or provide a few hours of adjunct teaching per month in exchange for a courtesy affiliation that will come with login credentials. Of course, some of these workarounds are quite tedious, and you may not be interested in growing an academic network. Some other options are quicker and easier to implement. Finding a freely available copy A lot of paywalled research papers are also available freely somewhere else on the web. It may not always be the final, publisher-approved version, but you will still get access to the results you want to read for yourself. Accessing the self-archived version. Many researchers post their papers on their own website, on their research institution’s website, or on self-archival websites such as Academia and ResearchGate. Instead of manually checking each of these, you can go to Google Scholar, paste the title of the article, and then check the “all versions” link. If it says “PDF” next to any of the versions, bingo! You’ve found yourself a freely available copy. Downloading the preprint. If you can’t find a freely available copy of the peer-reviewed article on Google Scholar, you may still be able to find the corresponding preprint. 
While it may slightly differ from the final, peer-reviewed version, the data should still be the same. Just make sure you are downloading the latest preprint. Some preprint servers include arXiv, Cogprints, and PeerJ. Going to a public library. If you want an excuse to take a walk, look no further. This is an old-school workaround, but most public libraries will give you onsite access to many research databases, though you may need a library card in some cases. If you are pressed for time, download the research you need and read it later in the comfort of your home. Installing a browser extension. Even easier, you can install Unpaywall to do all of the hard work for you. It harvests content from thousands of university and government websites from all over the world, and will tell you if there is a freely available copy somewhere, whether it's a preprint or an author's self-archived version. Open Access Button is another extension that looks for the open access version, or sends a request to the author. You can find many more open access journals and repositories by searching the Directory of Open Access Journals (DOAJ) and the Directory of Open Access Repositories (openDOAR). For instance, even if it is published in a paywalled journal, research funded by the National Institutes of Health must also be made available through PubMed Central, a repository of freely available scientific papers. This guide would not be complete without mentioning Sci-Hub. Created by Alexandra Elbakyan (read our interview), Sci-Hub will let you access almost any academic paper. As it's not legal in many countries, we cannot fully endorse it, and this is for informational purposes only. Open science, where anyone could access any research at any time, would greatly contribute to fostering scientific collaboration, nurturing our collective intelligence, and accelerating humanity's pace of discovery. The current model is antiquated and new models are being designed as you read this. In the meantime, there are many workarounds to access paywalled research papers without institutional access. Give them a go, and have fun expanding your knowledge and your network! The post How to access paywalled research papers without institutional access appeared first on Ness Labs.
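For readers who prefer to script the "find a free copy" step, Unpaywall also exposes a public REST API keyed by DOI. The sketch below assumes the v2 endpoint (api.unpaywall.org/v2/<DOI>?email=...) and its best_oa_location field, which is how the API is documented at the time of writing; check the current Unpaywall documentation before relying on it.

```python
from typing import Optional

import requests

def find_open_access_pdf(doi: str, email: str) -> Optional[str]:
    """Ask Unpaywall whether a free, legal copy of a paper exists.

    Assumes the Unpaywall v2 REST API: GET api.unpaywall.org/v2/<DOI>?email=...
    The email parameter identifies you to the service.
    """
    resp = requests.get(
        f"https://api.unpaywall.org/v2/{doi}",
        params={"email": email},
        timeout=10,
    )
    resp.raise_for_status()
    record = resp.json()
    location = record.get("best_oa_location") or {}
    # Prefer a direct PDF link, fall back to the landing page of the OA copy.
    return location.get("url_for_pdf") or location.get("url")

# Example usage (any DOI will do):
# print(find_open_access_pdf("10.1038/nature12373", "you@example.com"))
```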
June 2022 Updates
June 2022 Updates
New Things Under the Sun is a living literature review; as the state of the academic literature evolves, so do we. This post highlights three recent updates. How Distortionary is Publish-or-Perish to Science? As I wrote earlier this month, science appears to be getting harder. One possible cause of this is increasing competition and the incentive to publish. Maybe scientists can only keep up in the publishing race by doing increasingly slap-dash work? The article Publish-or-perish and the quality of science looked at some evidence on this in two very specific contexts where we have exceptionally good data. A new update adds in some papers that rely on poorer quality data, but which are able to assess a much wider set of contexts: We can find complementary evidence in two additional papers that have far less precision in their measurement but cover much larger swathes of science. Fanelli, Costas, and Larivière (2015) and Fanelli, Costas, and Ioannidis (2017) each look for statistical correlations between proxies for low quality research and proxies for pressure to publish. When we zoom out like this though, we find only mixed evidence that publication pressures are correlated with lower quality research. Fanelli, Costas, and Larivière (2015) look at the quality of research by focusing on a rare but unambiguous indicator of serious problems: retraction. If we compare authors who end up having to retract their papers to those who do not, do we see signs that the ones who retracted their papers were facing stronger incentives to publish? To answer this, Fanelli, Costas, and Larivière (2015) identify 611 authors with a retracted paper in 2010-2011, and match each of these retracted papers with two papers that were not retracted (the articles published immediately before and after them in the same journal). Fanelli, Costas, and Ioannidis (2017) look at a different indicator of "sloppy science." Recall that in Smaldino and McElreath's simulation of science, one aspect of a research strategy was the choice of protocols used in research. Some protocols were more prone to false positives than others, and since positive results are easier to publish, labs that adopt these kinds of protocols accumulate better publication records and tend to reproduce their methods. This form of publication bias leaves statistical fingerprints that can be measured. Fanelli, Costas, and Ioannidis (2017) tries to measure the extent of publication bias across a large number of disciplines and we can use this as at least a partial measure of "sloppy science." Each of these papers then looks at a number of features that, while admittedly crude, are arguably correlated with stronger incentives to publish. Are the authors of retracted papers more likely to face these stronger publication pressures? Are the authors of papers that exhibit stronger signs of publication bias more likely to face them? One plausible factor is the stage of an author's career. Early career researchers may face stronger pressure to publish than established researchers who are already secure in their jobs (and possibly already tenured). And indeed, each paper finds evidence of this: early career researchers are more likely to have to retract papers and show more evidence of publication bias, though the impact on publication bias was quite small. Another set of variables is the country in which the author's home institution is based, since countries differ in how academics climb the career ladder.
Some countries offer cash incentives for publishing, others disburse public funds to universities based closely on the publication record of universities, and others have tenure-type systems where promotion is more closely tied to publication record. When you sort authors into groups based on the policies of their country, you do find that authors in countries with cash incentives for publication are more likely to retract papers than those working in countries without cash incentives. But that’s the strongest piece of evidence based on national policy that publication incentives lead to worse science. You don’t observe any statistically significant difference between authors in these cash incentive countries when you look at publication bias. Neither do you see anything when you instead put authors into groups based on whether they work in a country where promotion is more closely tied to individual performance. And if you group authors based on whether they work in a country where publication record plays a large role in how funds are distributed, you actually see the opposite result than expected (authors are less likely to retract and show less signs of publication bias, when publication records matter more for how funds are disbursed). A final piece of suggestive evidence is also interesting. In Smaldino and McElreath, the underlying rationale for engaging in “sloppy science” is to accrue more publications. But in fact, authors who publish more papers per year were less likely to retract and their papers either exhibited less bias or no statistically different amount (depending on whether the first or last author is assigned to a multi-authored paper). There’s certainly room for a lot of interpretations there, but all else equal that’s not the kind of thing we would predict if we thought sloppy science let you accrue more publications quickly. Read the whole thing for my view on how all this literature fits together. But the short version is I think publish-or-perish, on average, probably introduces real distortions, but they aren’t enormous. Read the Whole Thing Measuring the Impact of Strange Combinations of Ideas A classic school of thought in innovation asserts that the process of innovation is fundamentally a process of combining pre-existing concepts in new and novel ways. One claim from this school of thought is that innovations that make particularly surprising combinations should be particularly important in the history of innovation. The article The Best New Ideas Combine Disparate Old Ideas looked at a bunch of evidence consistent with this claim, at least in the context of patents and papers. I’ve updated this article with two papers that provide new ways to measure this, in the context of academic papers. The first is by Carayol, Lahatte, and Llopis (2019): Carayol, Lahatte, and Llopis (2019) investigate this by using the keywords that authors attach to their own manuscripts as proxies for the ideas that are being combined. For a dataset of about 10 million papers published between 1999 and 2013, they look at each pair of keywords used in each paper, comparing how many other papers use the same pair of keywords as compared to what would be expected if keywords were just assigned randomly and independently. Using this metric of novelty, they find the more novel the paper, the more citations it gets and the more likely it is to be among the top 5% most cited. 
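The "compared to what would be expected if keywords were just assigned randomly and independently" step can be illustrated with a small sketch: under independence, the expected number of papers using keywords a and b together is the total number of papers times the product of each keyword's frequency, and pairs that co-occur far less often than that are the novel combinations. The function below shows that intuition only; it is not the exact normalisation used by Carayol, Lahatte, and Llopis (2019), and the counts are invented.

```python
# Intuition behind the keyword-pair novelty measure described above: compare
# the observed number of papers using a pair of keywords with the count
# expected if keywords were assigned to papers randomly and independently.
# The numbers below are invented for illustration.

def pair_novelty(n_papers, n_with_a, n_with_b, n_with_both):
    """Return (observed, expected, ratio) for a keyword pair.

    A ratio well below 1 means the pair is used together far less often than
    chance would predict, i.e. the combination is novel.
    """
    expected = n_papers * (n_with_a / n_papers) * (n_with_b / n_papers)
    ratio = n_with_both / expected if expected else float("inf")
    return n_with_both, expected, ratio

# A conventional pairing: two keywords that often travel together.
print(pair_novelty(10_000_000, 50_000, 40_000, 1_500))  # observed >> expected
# A novel pairing: both keywords are common, but almost never combined.
print(pair_novelty(10_000_000, 50_000, 40_000, 2))      # observed << expected
```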
In the article's figures (from Carayol, Lahatte, and Llopis (2019)), papers are sorted into 100 bins from least novel (left) to most novel (right), with the average citations received within 3 years, or the probability of being among the top 5% most cited, for papers in the same centile shown on the vertical axis. The second paper brings in a new way to measure the impact of unusual combinations, rather than a new way of measuring how ideas are combined or not combined. [A]s with patents, it would be nice to have an alternative to the number of citations received as a measure of the importance of academic papers that combine disparate ideas. Lin, Evans, and Wu (2022) provide one such alternative by comparing how disruptive a paper is with how unusual its combinations of cited references are. Intuitively, disruption is about how much your contribution renders prior work obsolete, and a new line of papers attempts to measure this with an index based on how much your work is cited on its own, and not in conjunction with the stuff your work cited. This is distinct from simply the number of citations a paper receives. You can be highly cited, but not highly disruptive, if you get a lot of citations but most of them also point to one of your references. And you can also be highly disruptive without being highly cited, if most of the citations you do receive cite you and only you. Lin, Evans, and Wu (2022) measure unusual combinations of ideas in the same way as Uzzi, Mukherjee, Stringer, and Jones and (among other things) compare the extent to which a paper makes unusual combinations to how disruptive it is. They find papers citing conventional combinations of journals are disruptive 36% of the time, whereas papers citing highly atypical combinations of journals are disruptive 61% of the time. In this context, a paper is disruptive if it receives more citations from papers that only cite it than citations from papers that cite both it and one of its references. That suggests unusual combinations are particularly important for forming new platforms upon which subsequent papers build. Read the Whole Thing A Bias Against Novelty Lastly, the article Conservatism in science examined a bit of a puzzle: scientists are curious people, so why would they appear to exhibit a bias against novel research? One strand in that argument was a paper by Wang, Veugelers, and Stephan, which presented evidence that papers doing highly novel work eventually get more citations, but are less likely to be highly cited by people in their own discipline, and take longer to receive citations. But that paper was inevitably based on just one sample of data using one particular measure of novelty. Carayol, Lahatte, and Llopis (2019) (discussed previously) provides an alternative dataset and measure of novelty that we can use to assess these claims. In the updated piece, I integrate their results with Wang, Veugelers, and Stephan. …Suppose we've recently published an article on an unusual new idea. How is it recei...
Get smarter everyday with Vladimir Oane founder of Deepstash
Get smarter everyday with Vladimir Oane founder of Deepstash
Welcome to this edition of our Tools for Thought series, where we interview founders on a mission to help us make the most of our mind. Vladimir Oane is the founder of Deepstash, a curation platform to become more inspired and productive through bite-sized ideas. Deepstash is built on the belief that ideas are the building blocks of the world and the Lego bricks for your mind. It allows you to nurture your mind by "stashing" ideas, either from your own reading sources, or from the Deepstash community of curators. In this interview, we talked about the nature of ideas, how curation is an act of creation, the many benefits of curating ideas, an antidote to the problem of endless scrolling, and the power of referencing ideas in your creative work. Enjoy the read! Hi Vladimir, thank you so much for agreeing to this interview. Let's start with a bit of a philosophical question: why do you think ideas are fundamental? Carl Jung once said: "We don't have ideas. Ideas have us." At Deepstash we believe ideas are fundamental. They helped us transcend our animal nature, shape our behaviour and come together as a group. All personal and collective progress can be traced back to an idea. Ideas are the building blocks of the modern world. They're the seeds of everything. Right now ideas are trapped inside books or other long-form content, which makes them hard to grab and use in our daily lives. Reading can be fun in and of itself, but the magic happens when we can relate to what we're reading. Unfortunately, social media usually tries to distract us with divisive news and silly entertainment, stuff which we can't ultimately connect to on a deeper level. In the words of Eleanor Roosevelt: "Small Minds Discuss People. Average Minds Discuss Events. Great Minds Discuss Ideas". At Deepstash we are setting ideas free. We believe that ideas are the seeds of everything and we want to see them burst and bloom. We are taking the lead in sharing and discussing ideas. We are reinventing social media, but better, around and for ideas. And that's what inspired you to create Deepstash. What my colleagues and I discovered is that a lot of our work is knowledge work. Things change so rapidly that it becomes imperative to stay on top of them. Constantly learning and re-learning is becoming a keystone skill in the 21st century. Personally, I've always been the type of guy who reads with a pen and who organises his insights in complex structures, using all sorts of note-taking tools (some of which you are reviewing here at Ness Labs). I've always seen my personal knowledge library as one of my main assets. Beneficial as it may be, we realised that such a system is cumbersome to create and most wouldn't bother. It's too complex. So we started Deepstash with this desire to give all people in the world idea-mixing superpowers. Ease of use on the go and social presence were our bets that would get people to connect through ideas. 2 million users later, we seem to be on to something. That sounds great, how does it work exactly? You start your Deepstash journey by setting up some topics of interest and maybe following some interesting curators. Tailored content recommendations from the ever-growing library of knowledge are then delivered to you on a daily basis. You will find inspiring quotes, meaningful books, explanations of abstract concepts or thought-provoking opinions. And you can stick to your preferred topics or you can explore myriad others.
Content on Deepstash is short and actionable, so there is a high chance you will stumble on an idea that will mean a lot to your career or personal life. You can save it with a tap, something we call stashing. You can also group ideas into stashes. In time you will build quite a library of ideas you can rely on and use whenever you have to write a work presentation or a college paper, give a speech, or take your first steps in investing in crypto. You can also share ideas with friends and colleagues and chat about them. It is quite easy, as it all happens on your mobile.

The term “idea” may sound vague to some people — what exactly do you mean by an idea?

Ideas are atomic pieces of knowledge. This means they are short, so they are easy to grab. And they are more actionable, as they usually refer to one thing and one thing only. An idea can be an inspiring quote, a practical method like the 7-minute workout, an explanation of a concept like the compound effect, a story like the parable of the horse, or facts about sleep. Ideas are created by Deepstash users, a community you can be part of. You can share, discuss, and follow collections of ideas on certain topics. You have access to a global library which you can explore by top creators, topics, and recommended content, and connect all of it to your stashed ideas.

People intuitively know that it’s helpful to be regularly exposed to new ideas, but sometimes we get stuck inside our own little bubble. How does Deepstash address this challenge?

On one hand, Deepstash is all about exploration. Our format and structure make discovering new things very easy. But I would say the dynamics of a product like Deepstash are different from a traditional social network where people mostly share news. On Deepstash people share ideas from books, articles, and podcasts. So the cool thing is that even if you get lost down a rabbit hole and realise two hours later that you’re still in the app, it’s not like social media, where you feel you wasted time; instead, you actually feel smarter. You also get new content suggestions every day based on your readings and favourite topics, and you get to follow your friends’ readings and those of top curators (people who might read or listen to more content than you, whose selection of ideas you can instantly access). It’s like a shortcut into someone’s brain.

Talking about curators, can you tell us more about the benefits of curating ideas with Deepstash?

Ideas are curated, not summarised. An idea arises when the author’s words meet the prior knowledge, experience, and interests of the curator. Thus curation is an act of creation. For example, many people have stashed ideas from Atomic Habits, but each take is unique. It’s like reading the same book but getting 1,000 different insights in an instant. Anyone can create ideas on Deepstash, and curating ideas is awesome. First, it makes you understand the content you are consuming: putting into words what you found inspiring makes you understand the concepts, and even yourself, better. It also helps you remember what you read. It helps you keep your insights organised and, in time, develop your knowledge library into an asset. Finally, it helps you spread the findings to other people who share your interests.

What are some of the most popular ideas on Deepstash?

We have so many.
Here are some to get you started:

Making the best out of every day: How to Have a Good Day by Caroline Webb
How to form good habits and break bad ones: Atomic Habits by James Clear
On managing our inner saboteur: Banish Your Inner Critic by Denise Jacobs
How fixed and growth mindsets impact our lives from childhood through adulthood: Mindset: The New Psychology of Success by Carol S. Dweck
Writing as a thinking tool: How to Use Writing to Sharpen Your Thinking by Tim Ferriss
Becoming a memory master: 4 Ways To Hack Your Memory by Lisa Genova
Harnessing the power of defaults: How to Make Smarter Decisions by Designing Your Defaults by Dan Silvestre

An ocean of ideas to explore! With such diversity, what kind of people use Deepstash?

Most of our users are busy knowledge workers. Curious minds. College-educated young creatives working in industries like tech, marketing, or HR. We have lots of anecdotes from our users about the amazing use cases they have for Deepstash. One professor used Deepstash in the classroom to get his students to debate ideas. A product guy used Deepstash for his 1:1s with his manager, using ideas as topics for a structured and meaningful conversation about his work and career development. A customer support person used some ideas he got from a book to pitch new policies for setting up company meetings. And a young Indian activist shared tons of resources related to the ecological cause he was fighting for. Anecdotes aside, common feedback we get from our community is that Deepstash is an antidote to the endless scroll. As one user put it: “Many people use social media as a way to stay in touch, learn about new ideas and interesting plans, or produce thoughts they’re thinking, but can’t put into words. This app allows for users to read easily digestible articles that can truly improve your life. Not only does it offer some great advice and insights, but it makes me feel better about who I am because I’m not wasting time reading into other people’s lives on the Gram or Tweeter.” In Deepstash, your scroll is always followed by a new insight, a bit of knowledge to get you started on your day.

What about you… How do you personally use Deepstash?

I am probably not unique in wearing different hats. I run a startup, and I am quite obsessed with product building, innovation, and knowledge management. I am trying my best to create an amazing company culture, all while trying to keep my cool with Zen meditation and reading science fiction. What can I say? I am as curious as any of you reading this interview. My main use case is collecting key insights from all the books I am reading, podcasts I am listening to, and videos I am watching. I am a very active curator, and you can check my profile to see all the stuff I am into. The constraints of the format mean I have to engage with the content on a deeper level so that I can compress findings into ideas, which helps tremendously with internalising the information. Deepstash will then make it available to me with a tap of a button so I can rediscover it later. This way I never worry ab...
Temptation bundling: how to stop procrastinating by boosting your willpower
Temptation bundling: how to stop procrastinating by boosting your willpower
You know you should be working on that presentation, but you’ve been procrastinating. To make things worse, the latest season of your favourite show has just dropped on Netflix. Luckily, making progress on your work and indulging in activities you enjoy are not only compatible; combining them can make you more productive. That’s called temptation bundling. Temptation bundling is a productivity technique that involves combining an activity that gives you instant gratification, such as watching TV, with one that is beneficial but has a delayed reward, such as exercising. If you only allow yourself to watch TV while you’re on a treadmill, you may be more likely to exercise regularly than you would otherwise have been. Temptation bundling can help you avoid procrastination, reduce short-sighted decision making, and improve both your physical and mental health. Let’s have a look at the scientific evidence for temptation bundling, and explore how you can use this technique to boost both your productivity and your health.

The science of temptation bundling

The term temptation bundling was first coined in 2014 by Professor Katherine Milkman of the Wharton School at the University of Pennsylvania. As part of her study, Milkman noted some worrying statistics: in the USA, 68% of adults had been classed as overweight or obese in 2008, and 112,000 Americans were dying each year as a result of obesity and its complications. Milkman therefore emphasised that promoting weight loss was an urgent priority for public health. Alongside researchers Julia A. Minson and Kevin G. M. Volpp, Milkman began to investigate temptation bundling in relation to exercise. The team hypothesised that if this productivity tool could increase the chance of an individual making wise or healthy choices, such as exercising regularly, then it could lead to health improvements and weight loss. Milkman designed a study to ascertain whether an individual’s drive to exercise would increase if they were given a page-turner audiobook that they could only listen to while at the gym. She described this as bundling instantly gratifying but guilt-inducing ‘want’ experiences with valuable ‘should’ experiences, which may increase an individual’s commitment to the ‘should’ activity. The results showed that participants in the temptation bundling group visited the gym 51% more frequently than the control group during the 10-week study. The desire to carry on listening to a great audiobook had boosted their commitment to visiting the gym: combining a ‘want’ with a ‘should’ increased the chance that a participant would make the healthy decision to exercise. However, Milkman noted that, as with many exercise programmes, gym visits declined towards the end of the study. This was particularly true after the Thanksgiving holiday period, which fell between weeks 7 and 8 of the study and interrupted participants’ exercise routines. The researchers concluded that temptation bundling may become less effective over time or following changes to normal routines, and that incentives to return to the gym following periods of abstinence, such as offering a new audiobook, could be helpful. However, the researchers also noted that audiobooks don’t suit everyone, and that alternative forms of hedonism, such as the option to watch TV at the gym, may be a better incentive for some. As with many productivity techniques, the general principles need to be adapted to people’s specific needs.
This flexibility suits temptation bundling well: it is a versatile technique you can use to combine your temptations with your goals.

The many faces of temptation bundling

Exercising may not feel good at the time, but combining it with a gripping audiobook makes it feel more enjoyable. This means you are far more likely to exercise than if the incentive of listening to more of a mesmerising novel were not there. As Milkman illustrates, temptation bundling is usually extremely inexpensive, making it a more sustainable way to commit to ‘should’ tasks. Giving yourself permission to watch TV only if you complete your ironing while watching costs nothing, but helps ensure this chore gets completed. To get your other household chores done, you could only listen to your favourite podcast or catch up on a favourite radio show when you are cleaning, washing up, or doing laundry. If you know that you cannot listen to the show at any other time, you’re far more likely to get these chores done, as you will be instantly rewarded while doing so. TV and audiobooks might not feel as tempting for everyone, but there are many other examples of pleasurable activities that could form part of a temptation bundle. If you enjoy getting your hair cut or having a pedicure, use this time to catch up on overdue work emails or other admin, such as insuring your car. This way, you can use an enjoyable experience to encourage the completion of tedious but necessary tasks. If you have to meet up with a difficult colleague or relative, do so at your favourite coffee shop. The lure of the nice coffee will reduce the chances that you procrastinate on the difficult meeting, and will also make the meeting less stressful in general. As you can see, temptation bundling works by using so-called guilty pleasures, such as getting a pedicure, indulging in TV shows, or eating out, as you complete the tasks that are less enjoyable. Fulfilling these tasks may not bring you immediate joy, but it will contribute to your long-term goals in your professional and personal life.

How to create your own temptation bundle

Temptation bundling is a great way to increase your willpower, ensure you make sensible decisions, and maximise your productivity. As Erika Kirgios and colleagues, including Katherine Milkman, summarised, temptation bundling “combats present bias by making behaviours with delayed benefits more instantly-gratifying.” You will then reap the rewards of your sensible choices in future. To be effective, your bundle will need to appeal to your individual taste. Follow these simple steps to create effective temptation bundles that are personal to you (a brief sketch of the pairing process appears at the end of this post):

Create a two-column list. In one column, write down all the activities that bring you joy or that you find relaxing, such as watching TV, reading, or listening to podcasts. In the second column, list all of the tasks and behaviours that are less enjoyable or that you are prone to procrastinate over, such as exercise or chores.

Combine ‘wants’ with ‘shoulds’. After you have taken the time to write the two lists, you can start browsing them to make suitable combinations of gratifying ‘want’ behaviours and necessary ‘should’ activities.

Check for conflict. It is important to make sure that the two items do not physically conflict with each other. You must be able to effectively perform both behaviours at the same time.
For example, trying to reply to important work emails while watching one of your favourite TV shows may not be the best combination, as your concentration levels are likely to be affected. Temptation bundling is a simple yet effective productivity technique that combines a hedonistic activity with a chore, offering short-term gratification alongside long-term gain. It will help you reduce procrastination and increase your overall efficiency. By taking the time to reflect on your own ‘wants’ and ‘shoulds’, you can start to make wiser decisions and even boost your mental and physical health.
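For readers who like to tinker, here is a minimal, purely illustrative sketch of the three steps above in Python. Every name in it (wants, shoulds, conflicts, build_bundles) and every example activity is invented for this sketch; it simply enumerates non-conflicting want/should pairings and is not part of the original article or of Milkman’s study.

```python
# A toy sketch of the two-column temptation-bundling exercise described above.
# All names and example activities are illustrative assumptions, not from the article.

from itertools import product

# Column 1: instantly gratifying 'want' activities.
wants = ["watch a TV show", "listen to a podcast", "get a pedicure"]

# Column 2: beneficial but easy-to-postpone 'should' tasks.
shoulds = ["run on the treadmill", "fold the laundry", "answer work emails"]

# Pairs that cannot realistically be done at the same time
# (the "check for conflict" step).
conflicts = {
    ("watch a TV show", "answer work emails"),  # both compete for attention
}

def build_bundles(wants, shoulds, conflicts):
    """Return every non-conflicting (want, should) pairing."""
    return [
        (want, should)
        for want, should in product(wants, shoulds)
        if (want, should) not in conflicts
    ]

if __name__ == "__main__":
    for want, should in build_bundles(wants, shoulds, conflicts):
        print(f"Only allow yourself to {want} while you {should}.")
```

Running it prints suggestions such as “Only allow yourself to listen to a podcast while you fold the laundry.” The point is not the code itself but the discipline it encodes: pair each ‘want’ with a ‘should’, and discard any pairing where the two activities get in each other’s way.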