Alright, so I'm back at it. Trying to learn the fundamentals of CS, and what better institution to learn from than Harvard! I'm currently writing this as my CS50 IDE is being updated. Before I get started with my first problem set, I thought I'd take the time to share the topics that were discussed in the first week of CS50 and what I learned. After every lecture/week I'll give bullet points of the topics discussed and share my favorite part or something cool I learned.
- source code (human-readable syntax that we ultimately convert to machine code) -> (compiling) -> machine code (the binary discussed in Week 0)
- the command line
- comparisons between C and Scratch (I skipped Week 0, I know, shame on me, but this lecture keeps referring back to it to draw those comparisons)
- the CS50 IDE (Integrated Development Environment)
My favorite part:
Learning about overflow and imprecision (basically the downfalls of having limited memory/bits). Data types have a finite number of bits, and that limits what a program can represent. For example, if you try to display the full expression of a float, it may not come out precisely, because there isn't an infinite number of patterns of zeros and ones to represent every possible real number. That's imprecision. Overflow is the flip side: once a number grows past what its finite bits can hold, it can wrap around and incorrectly become a negative, positive, or zero value, which sounds minor but can lead to major consequences. David Malan, the professor teaching the course, mentions how an overflow bug negatively impacted a Boeing airplane! Numbers are powerful, y'all!