Computer Science Self-study Model #2

So I’ve found that some of the online free coursework I’ve been going through is not sufficient, or at least not what I wanted/expected.

Interestingly, I found a site that lists and links a number of courses that can earn you a free, self-studied CS degree.

So, change of plans. I am going to go through those courses instead, since they are more comprehensive and will give me the education I need/want. I plan on going through most of the courses, on the following timeline:

October 2015:

November 2015:

December 2015:

January 2016:

February 2016:

March 2016:

April 2016:

May 2016:

June 2016:

July 2016:

August 2016:

September 2016:

October 2016:

November 2016:

December 2016:

January 2017:

February 2017:

March 2017:

April 2017:

May 2017:

June 2017:

July 2017:

Georgia Tech Online Masters in Computer Science

Georgia Tech has an online Masters in Computer Science (CS) available for about $7k. If you’re considering a career in software engineering and don’t have an undergraduate background in CS, you could still go through the Masters program, provided you get a few courses under your belt first.

The program has a number of specializations; focusing on the Specialization in Computing, these would be helpful courses to take beforehand:

  1. Discrete Mathematics
  2. Intro to Software Engineering
  3. Python programming
  4. Computer Networks
  5. Cybersecurity
  6. Databases
  7. Operating Systems
  8. Hardware/Software
  9. Computer Architecture
  10. Software Management

These would be in conjunction with my previous post outlining self-study in CS.

Self-learning computer science

I’m sure you’ve heard of Coursera, the site that hosts university courses on a wide range of topics.

Since I don’t have the time or the money to pursue a formal brick-and-mortar degree in computer science (CS), I have decided to create my own CS syllabus.

The following are courses that I think are important enough to take when learning CS and working towards a career in software engineering/development or tech writing. I based my decisions on various people I have asked personally, and my own background research into the field via blogs, forums, Google searches, etc.

If you have opinions on my syllabus, feel free to share.

I’m going to try this course menu out and let you know how it fares. As I complete each course, I will list the dates it took me to complete it, and will cross the class out.

  1. Computer Science 101, Stanford University (6 weeks) (I think I can complete this in 3 weeks, however: September 8-September 25. I ended up stopping after week 4, as I found the course too simplistic for my needs at this time.)
  2. Computer Science, Harvard University (self-paced)
  3. Intro to Logic, Stanford University (8 weeks) (September 28-November 21)
  4. Logic: Language and Information I, The University of Melbourne (Self-paced)
  5. Logic: Language and Information II, The University of Melbourne (Self-paced)
  6. Algorithms, Part I, Princeton University (6 weeks)
  7. Algorithms, Part II, Princeton University (6 weeks)
  8. Java programming part 1 (pre-req for Algorithmic Thinking)(5 weeks)(September 28-October 30).
  9. Algorithmic Thinking, Part I, Rice University (4 weeks)
  10. Algorithmic Thinking, Part II, Rice University (4 weeks)
  11. Learning to Program, Crafting Quality Code, University of Toronto (10 weeks)
  12. Programming Languages, University of Washington (Self-paced)
  13. Learn to Program: The Fundamentals, University of Toronto (10 weeks)
  14. An intro to Interactive Programming in Python, Part I, Rice University (5 weeks)
  15. An intro to Interactive Programming in Python, Part II, Rice University (5 weeks)
  16. Cryptography I, Stanford University (6 weeks)
  17. Cryptography II, Stanford University (6 weeks)
  18. Game Theory, Stanford University (9 weeks)
  19. Game Theory II, Advanced Applications, Stanford University (6 weeks)
  20. Software Security, University of Maryland (6 weeks)
  21. Hardware Security, University of Maryland (6 weeks)
  22. The Hardware/Software Interface, University of Washington (8 weeks)
  23. Introduction to Databases, Stanford University (Self-paced)

I figure a typical college student takes 3-4 courses per semester. With 23 courses, at 3 courses per semester, that’s about 8 semesters, or 4 years.

Alternatively, you could do one course per month if self-paced, so 23 months, or about 2 years.

Non-Coursera courses:

  1. Learn C the Hard Way
  2. Learn SQL the Hard Way
  3. Learn Ruby the Hard Way


  1. Codecademy: JavaScript
  2. Codecademy: jQuery

Interpreters, Compilers, and Learning

[Disclaimer: I know very little about computers and operating systems at this point, as I just started going back to college for my second BS, this time in CS. However, with my background in neuroscience, I can’t help but try to find parallels between what I already know about the brain and the things I’m learning about computers. I realize that the worn analogy of brains and computers doesn’t always hold up, but as I try to understand the new things I’m learning, I’m going to refer back to what I already know: the brain.

As I learn more, I’ll probably update articles. If you have any insight into anything I’ve written, please share with me, as I and my readers love to learn!]


Computers can only execute programs that have been written in low-level languages. However, low-level languages are more difficult and time-consuming to write in. People therefore tend to write programs in high-level languages, which must then be translated into low-level languages before the program can be run.

Now, there are two kinds of programs that can process high-level languages into low-level languages: interpreters and compilers.

Interpreters read the high-level program and execute it, reading the code one line at a time and executing between lines. Hence the term INTER (between) in INTERpreter.

Compilers read the program and translate it completely before the program runs. That is, a compiler translates the program as a whole, and then the program runs. This is unlike the interpreter’s one-line-at-a-time method.
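
To make the contrast concrete, here’s a tiny Python sketch (Python, since it’s on my syllabus above). Everything in it is my own toy illustration, not how real interpreters or compilers are implemented; Python’s built-in exec() and compile() are only standing in for the two strategies.

    # A toy three-line "program", stored as separate source lines.
    program = [
        "x = 2",
        "y = x * 3",
        "print(y)",
    ]

    # Interpreter-style: handle one line at a time, executing each
    # line before even looking at the next one.
    env = {}
    for line in program:
        exec(line, env)  # translate and run this line, then move on

    # Compiler-style: translate the whole program up front into a
    # single code object, then run the finished result in one step.
    code = compile("\n".join(program), "<toy>", "exec")
    exec(code, {})

Both halves print 6; the difference is only in when the translation happens: line by line, versus all at once before running.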

These aspects of programming got me thinking a bit.

Compilers remind me of automatic processes, like when we are operating on auto-pilot. Our brain is still taking in information, but it’s not processing it one bit at a time; it’s more big-picture, and less into the details at a given moment.

Interpreters, however, remind me of when we are learning something new: our brains are more focused on the details and more interested in processing things in bits, and then “running” them. That is, when we are struggling with new information, or just ingesting it, our brains are more apt to take in bits of information at a time, process them, and then move on to the next piece. That way, if something is not understood, it’s caught early in the process and can be remedied.

Findings Friday: The aging brain is a distracted brain

As the brain ages, it becomes more difficult for it to shut out irrelevant stimuli—that is, it becomes more easily distracted. Sitting in a restaurant, having a conversation with your table partner right across from you, presents a new challenge when the restaurant is buzzing with activity.

However, the aging brain does not have to be the distracted brain. Training the mind to shut out irrelevant stimuli is possible, even for the older brain.

Brown University scientists conducted a study involving seniors and college-age students. The experiment was a visual one.

Participants were presented with a letter and number sequence, and asked to report only the numbers, while simultaneously disregarding a series of dots. The dots sometimes moved randomly, and at other times, moved in a clear path. The latter scenario makes the dots more difficult to ignore, as the eye tends to want to watch the dots move.

The senior participants tended to unintentionally learn the dot motion patterns, which was revealed when they were asked to describe which way the dots were moving. The college-age participants were better able to ignore the dots and focus on the task at hand (the numbers).

Another study also examined aging and distractibility, or the inability to maintain focus on a goal due to attention to irrelevant stimuli. Here, aging brains were trained to be more focused. The researchers used older rats as well as older humans. Three different sounds were played during the experiment, one of them a target tone. Rewards were given when the target tone was identified and the other tones ignored. As subjects improved, the task became more challenging, with the target tone becoming less distinguishable from the other tones.

However, after training, both the rats and the humans made fewer errors. In fact, electrophysiological brain recordings indicated that neural responses to the non-target, or distracting, tones were decreased.

Interestingly, the researchers indicated that ignoring a distractor is not simply the flip side of focusing on a task. Older brains can be just as efficient at focusing as younger brains. The issue in aging brains, however, lies in being able to filter out distractions. This is where training comes in: strengthening the brain’s ability to ignore distractors, not necessarily enhancing its ability to focus.

The major highlights of the study were that training enhanced aspects of cognitive control in older humans, and that the adaptive distractor training selectively suppressed the brain’s responses to distractors.

Technique Thursday: Microiontophoresis

In the brain, communication is both electrical and chemical. An electric impulse (action potential) propagates down an axon, and chemicals (neurotransmitters) are released. In neuroscientific research, the application of a neurotransmitter to a specific region may be necessary. One way to do this is via microelectrophoretic techniques like microiontophoresis. With this method, neurotransmitter can be administered to a living cell, and the consequent reactions recorded and studied.

Microiontophoresis takes advantage of the fact that ions flow in an electric field. Basically, during the method, current is passed through a micropipette tip and a solute is delivered to a desired location. A key advantage to this localized delivery and application is that very specific behaviors can be studied within context of location. However, a limitation is that the precise concentration of solute may sometimes be difficult to determine or control.
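
For a back-of-the-envelope sense of that dosing problem, the standard Faraday-law treatment bounds how much solute a given ejection current can deliver (the notation below is mine):

    n = \frac{I \, t \, n_t}{z \, F}

Here n is the moles of solute ejected, I the ejection current, t the ejection time, z the ion’s valence, and F Faraday’s constant (about 96,485 C/mol). The catch is the transport number n_t, the fraction of the current actually carried by the drug ion: it depends on the pipette and the solution, and is hard to pin down, which is exactly why the delivered concentration is difficult to determine.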

The main component of this technique is the use of electrical current to drive the release of solute. In other words, it is an “injection without a needle.” The current is typically galvanic (direct) current, applied continuously. The solute needs to be ionic, and must be placed under an electrode of the same charge. That is, positively charged ions must be placed under the positive electrode (anode), and negatively charged ions under the negative electrode (cathode). This way, the anode will repel positively charged ions into the tissue, and the cathode will repel negatively charged ions into the tissue.

Manic Monday: Eye of the Storm

A classic experiment on discrimination was Jane Elliott’s blue eyes/brown eyes experiment. Jane Elliott is a former third-grade teacher, with no research background to speak of. However, the day after Martin Luther King Jr. was shot, she decided to try a little experiment with her young, impressionable students.

What she did next was nothing short of fascinating.

On April 4, 1968, Jane Elliott was ironing a teepee for one of her classroom activities. On the television, she was watching news about the assassination of King. One white reporter mentioned something that shocked Elliott:

“When our leader [John F. Kennedy] was killed several years ago, his widow held us together. Who’s going to control your people?”

Elliott could not believe that the white reporter considered Kennedy a leader for white people, and assumed that black people would now get out of control.

So she decided to twist her little Native American classroom exercise and replace teepees and moccasins with blue-eyed and brown-eyed students.

So, on the first day of her experiment, Elliott decided that since she had blue eyes and was the teacher, blue-eyed students were superior. The blue-eyed and the brown-eyed children were consequently separated based on something as superficial as their eye color.

Blue-eyed children were given brown collars to wrap around the necks of their brown-eyed peers. All the better to notice them with.

The blue-eyed children were then given extra helpings of food at lunchtime, five extra minutes at recess, and the chance to play on the new jungle gym at school. The brown-eyed children were left out of these activities. The blue-eyed children were also allowed to sit at the front of the class, while brown-eyed children were kept at the back.

Blue-eyed children were encouraged to play with other blue-eyeds, but told to ignore brown-eyed peers. Further, blue-eyed students were allowed to drink at the water fountain, while the brown-eyed ones were prohibited from doing so. If the brown-eyed children forgot, they were chastised.

Now, of course the children resisted the idea that the blue-eyed students were somehow superior. Elliott countered eloquently, and with a lie: melanin, she told them, is linked to blue eyes, as well as to intelligence.

The students’ initial resistance wore down.

The blue-eyed “superior” students then became arrogant and bossy. They were mean, and excluded their brown-eyed peers. They thought themselves superior, simply on the basis of their eye color.

What’s even more interesting is that the blue-eyed students did better on some of their exams, performing at a higher level in math and reading than they had previously. Just believing they were superior affected their grades positively.

Even more interesting, but perhaps not surprising, was what happened to the brown-eyed students:

They became shy, timid, and, frighteningly, subservient. They did worse on their tests, and during recess kept themselves away from the blue-eyed children. Each group effectively segregated itself by eye color.

The next week, Elliott added another twist to the experiment: she made the blue-eyed students inferior, and made the brown-eyed ones superior. Brown collars for the blue-eyeds now.

The brown-eyeds then began to act meanly towards the blue-eyed kids, though with less intensity.

Several days later, the blue-eyed students were told they could remove their brown collars. Elliott then had the students reflect on the experiment by writing down what they thought and what they had learned from it.

Needless to say, the experiment had a major impact on her students. Elliott continued the experiment with her students for years after, and has appeared on Oprah and other venues, promoting anti-discrimination.

A documentary was filmed about her experiment, called Eye of the Storm.

A beautiful video about a modern re-enactment of the experiment can be found here.