Connected: The Fascinating Future of Brain-Computer Interfaces

Author’s Note: Article originally published in BrainWorld Magazine.


Technology and the Human Brain

My original article can be found on BrainWorld Blog.

Being wired to technology isn’t necessarily good for our neural wiring. Multitasking—which many people think they do well—is not healthy for neural efficiency. With each task switch, there is a cost, which makes our brains less efficient, and depletes nutrients more readily than if we concentrated on one task at a time.

Further, what most people consider “multitasking”—responding to an email while speaking on the phone, while jotting down notes, while reading an article—is actually a rapid series of task switches. Human brains are not wired to multitask well—we are wired to perform one main task at a time. When we override this and try to do several things at once, or try to juggle multiple tasks, we end up drained, less productive, and with elevated cortisol levels, which is known to damage neurons.

An MIT neuroscientist, Dr. Earl Miller, is a world expert on divided attention and multitasking. He says that our brains are just “not wired to multitask well…When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.”

This cognitive cost—called a switch cost in the cognitive literature—results in slower performance and decreased accuracy. The switch cost persists even when people know beforehand that a switch is coming. This indicates that many executive control processes are at play, including attention shifting, goal setting and retrieval, and inhibition of the prior task. This is not a load the human brain can sustain efficiently.

Consider allowing yourself to complete just one task at a time. Respond only to an email, then make that phone call. Allot yourself time limits if that helps you with efficiency. Trying to do otherwise—juggling between multiple tasks simultaneously—will make you less productive and more depleted.

Ever have a few minutes of “down-time”? Maybe you’re sitting at the doctor’s office waiting to be called in, or standing in the Starbucks line waiting for your drink to be made. Notice how people quickly snap out their phones and scroll? Or maybe you don’t because you are doing the same thing.

This constant bombardment of stimulation is unhealthy for your brain. The human brain needs moments of silence, moments of downtime, to function well. Giving your brain a break—from information, articles, social media feeds, music, emails, and everything else—means giving your brain a chance to replenish itself.

These moments of silence can foster creativity, reduce stress levels, and enhance overall brain function. You may think you are doing yourself a favor reading the latest news articles, or scrolling through photos of how a rocket is put together, but in reality, you are overstimulating your brain.

Technology has a great capacity to help us. From providing us information to solve problems, acting as a creative outlet, giving us the freedom to explore new concepts, technology has opened up worlds to us. However, these freedoms and abilities come with a cost. As in anything, moderation is key. Allow yourself some time to “unplug” every day. If not every day, then choose one day a week to unplug for a few hours, if not the whole day. Keep a notebook on how you feel. Are you less stressed? More thoughtful? Perhaps more creative?

I’d bet after a few of these unplugs, you’ll find yourself enjoying your own company, and the company of others, more, and you’ll be far more creative and content in your daily life.

You may even begin craving these moments of unwiring.

Computational Neuroscience/Neural Engineering

As I explore more of what I want to do (computational neuroscience, neural engineering, software engineering, etc.), I’m learning more about what can be done with which degrees.

I really am finding myself interested in the idea of building things: not just software, but AI, neurorobotics, etc.

I assume these all need PhDs. But in what field?

Seems like a neuroscience and computer science/engineering background is a must. Or at least the computer background is. Biomedical or computer engineering also seems to be expected to some degree.

So the question is now: what do I get a graduate degree in?

I have a BS in neuroscience and am working towards a second BS in computer science. What do I get my master’s and potential PhD in? I was thinking about an MS in computer science with a specialization in machine learning. But now I’m thinking an MS in computer engineering would be good too.

So I did a little research on Indeed.com for computational neuroscience jobs in the DC area.

Here’s what I found:

Job Title: Neuroscientist / Biomedical Engineer
Job Highlights: Seeking a Ph.D. in Neuroscience, Biomedical Engineering, Biomechanics, or related field. Candidates should have an interest and ability to conduct research and advise federal partners on topics including blast injury, traumatic brain injury, rehabilitation engineering, human performance monitoring and enhancement, and noninvasive brain-computer interfaces. Secondary topics may include biologically-inspired approaches to artificial intelligence, computer vision, and robotics.
Required Skills: The successful candidate will have expertise in applied science and computer programming (Matlab, Java, C++, Python, or other languages relevant to scientific computing), specifically with experience that may include biomechanical modeling of injury and/or musculoskeletal movement, and computational neuroscience.
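
To get a concrete feel for what “computer programming (Matlab, Java, C++, Python, or other languages relevant to scientific computing)” plus computational neuroscience means in practice, here is a minimal sketch in Python: a leaky integrate-and-fire neuron, about the simplest spiking-neuron model there is. This is my own toy illustration; the function name and every parameter value are assumptions on my part, not anything taken from the listing.

    # Toy leaky integrate-and-fire (LIF) neuron: the classic first model in
    # computational neuroscience. All values below are illustrative only.
    import numpy as np

    def simulate_lif(i_input, dt=1e-4, tau_m=0.02, v_rest=-0.070,
                     v_reset=-0.075, v_thresh=-0.054, r_m=1e7):
        """Integrate dV/dt = (-(V - v_rest) + R_m * I) / tau_m with spike-and-reset."""
        v = np.full(len(i_input), v_rest)   # membrane voltage trace (volts)
        spike_times = []
        for t in range(1, len(i_input)):
            dv = (-(v[t - 1] - v_rest) + r_m * i_input[t - 1]) / tau_m
            v[t] = v[t - 1] + dv * dt
            if v[t] >= v_thresh:            # threshold crossed: record spike, reset
                spike_times.append(t * dt)
                v[t] = v_reset
        return v, spike_times

    # 500 ms of constant 2 nA input current (at the default 0.1 ms timestep)
    # is enough to drive this neuron to spike repeatedly.
    current = np.full(5000, 2e-9)
    voltage, spikes = simulate_lif(current)
    print(f"{len(spikes)} spikes in {len(current) * 1e-4:.1f} s of simulated input")

Nothing fancy, but being comfortable writing and sanity-checking this kind of small numerical model seems to be close to the baseline these postings expect.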

Other positions (the few there were) demanded similar credentials, along with PhDs. Looks like I should get that PhD after all…

MIT-inspired Neuroscience Curriculum

I’m thinking about going for my PhD in neuroscience, specifically computational neuroscience. With a BS in neuroscience, and working on my second BS in computer science, I need to do all I can to boost my grad application and make sure I have the coursework that will make me an attractive student.

So, I turned to MIT for their undergraduate and graduate neuroscience curricula, and created this new self-study curriculum for the next year. (I plan on applying in December 2017, so I’ve got a year to refresh on undergrad neuro and teach myself a few graduate neuro topics/courses.)

Introduction to Computer Science Programming in Python
Introduction to Computational Thinking and Data Science
Introduction to Computer Science
Introduction to Neuroscience
Introduction to Neural Computation
Intro to Computer/Electrical Engineering
Sensory Systems
Fundamentals of Biology
Intro to Neurorobotics
Cognitive Neuroscience
Brain Structure and its Origins
Perceptual Systems
Neural Circuits
Neurobiology of Learning and Memory
Cellular Neurophysiology
Animal Behavior
Developmental Neurobiology
Neuroendocrine Systems
Intro to Comp and Elec Problem Solving
Genes, Circuits, and Behavior
The Brain and Its Interface with the Body
Cell NeuroBio
Systems Neuroscience
Cell and Molecular I
Cell and Molecular II

December 2016:

  1. Introduction to Neuroscience
  2. Fundamentals of Biology

January 2017:

  1. Introduction to Computer Science
  2. Introduction to Computational Thinking and Data Science

February 2017:

  1. Introduction to Computer Science Programming in Python
  2. Sensory Systems

March 2017:

  1. Intro to Computer/Electrical Engineering

April 2017:

  1. Intro to Comp and Elec Problem Solving

May 2017:

  1. Introduction to Neural Computation

June 2017:

  1. Cellular Neurophysiology

July 2017:

  1. Intro to Neurorobotics

August 2017:

  1. Cell NeuroBio

The Psychology of the Self-Righteous

We all know that person: the Mr. Always-Right co-worker, who always thinks he’s got it down and everyone else is wrong. The self-victimizing acquaintance who thinks she treats everyone generously and kindly but whom everyone else treats like dirt. The friend you grew up with who thinks he’s reflective and everyone else needs to learn that skill.

At some point in our lives, we will come across the self-righteous person. With their criticism, indignation, and conceit, they tend to grate on our nerves and throw us off our track—if we let them.

But what is the psychology behind the self-righteous personality, and how does it affect you when you have to deal with that BS?

A number of variables grow the self-righteous mind, but a few characteristics are shared by those who think they’re oh-so-good-and-right:

  1. Overgeneralizations: Take a negative incident, throw some magic growth-powder on it, and you have an exaggeration. Look out for “always”, “never”, and “all” from these people, and you could have some self-righteousness burbling in the cauldron.
  2. Positive-discounting: On the flip side of overgeneralizations is taking the stance that positive things, like characteristics of others, aren’t as important. “Hey, you’re nice, but who cares. Being nice is overrated, anyway.”
  3. Jumping to conclusions: We all do it, but the self-righteous person is skilled at this. Conclusions are reached with little to no unbiased evidence behind them. “So-and-so didn’t give me money as a Christmas present this year; they don’t care about me and are cheap. Ugh, I hate cheap people.”
  4. Black-and-white thinking: Either you’re perfect, or you’re not. If you fall short of expectations, it’s because you’re not [insert some quality here], and that’s a reflection of your entirety.

Of course, the self-righteous person doesn’t always have to share these qualities, and their behavior may not draw from these thinking patterns. I know some people who seem self-righteous, and perhaps they are, but it’s due more to social environment than anything else. That’s not to say they haven’t adopted the self-righteous attitude, but I don’t think they would’ve turned out that way had it not been for certain social attitudes around them.

But what makes the self-righteous attitude such a pervasive form of thinking for those who engage in it?

There are a number of reasons based on basic human psychology.

The backfire effect is a relatively common human tendency to protect whatever is added to your collection of beliefs. That is, whatever you decide to believe, you tend to dismiss what doesn’t match up to that perspective, and suck in the “evidence” that matches up to it. So you end up glued to your beliefs and never questioning them, even when new information comes your way that shakes up the foundations of those beliefs—or rather, could shake those foundations if only the backfire effect didn’t get in the way.

The backfire effect is mostly due to cognitive laziness—our brains don’t want to work, so we sink into those explanations that don’t take too much energy to process. The more strenuous it becomes to process things, the less credibility you think they have.

Think you don’t have this tendency yourself, though? Think again.

The next time someone praises you and then another person criticizes you, explore how you feel. Chances are, a thousand “You’re so smart”s will affect you differently than a single “You’re not smart enough.” You’ll let the praise slip right through your mind, but you’ll latch onto the negative comment.

Why?

The backfire effect and another tendency have something to do with it. People tend to spend more time considering information they disagree with than information they accept. Anything that lines up with your way of thinking passes through your processing easily, but anything that threatens your beliefs grabs onto your awareness and holds on. With the backfire effect in play as well, you’ll end up not believing the harder pill to swallow, because it takes too much energy to process. Even if you dwell on the criticism, you’ll fight against believing it and try to find all the ways that criticism is wrong about you.

Why is this so? Evolution may be at play, here.

Our human ancestors paid more attention to negative stimuli than positive, because if the negative wasn’t addressed, that could mean death.

Biased assimilation has something to do with all this, as well. Kevin Dunbar ran an fMRI experiment where he showed subjects information that confirmed their beliefs about something. Brain regions relating to learning lit up.

But when given contradictory information, those learning brain regions didn’t light up. Rather, brain regions relating to suppression lit up.

In other words, presenting information doesn’t necessarily change the way people think or what they believe. We’re all susceptible to cognitive biases—some of us more so than others.