Frustrated with half-assed knowledge

I don't even know how to phrase this, but I'll try. I know Python, Java, C, and Ruby, so it isn't about languages. I am just constantly bothered about not really understanding how the stuff I wrote in my IDE became 0's and 1's, and how that produced - in the context of registers and memory as well as silicon chips - what I asked the computer to produce. All I know about registers is that they exist for the CPU to process current data; all I know about the OS is its workflow with stacks, and other things they told us in CS classes. I don't have an intuitive understanding of how servers translate from electronic to systematic, or how the stuff I'm submitting to my department's server really reaches it. I want to know where to start. These things do get in the way of coding, because when I'm doing memory allocation in C I really feel the need to imagine the bigger picture of memory inside the computer.

Edit: For those who feel this is a big undertaking, I am not looking for too nuanced details. Just an overall idea of the big picture, enough to trace what happens to code on the way to memory and back. Any recommendations?

level 1
289 points · 8 days ago · edited 7 days ago

It sounds like you want to read the famous dragon book and dinosaur book. Might as well have a read through the wizard book to finish the set. Nand2tetris also comes to mind.

I will also say that you don't need to know any of this unless it's stopping you from doing something you want to do. It's a fundamental skill of a programmer to abstract away unnecessary details: if I write a query to a server and I follow the API's specs, why do I care if it's routed through a local network, sent through the internet, flown by a carrier pigeon or conveyed by a smoke signal?

EDIT: Let me reiterate

you don't need to know

If you want to know, then go learn it. I never said that you shouldn't know these things.

unless it's stopping you from doing something you want to do.

If you actually have need for it, then that's when you should learn it. I had to optimise my C++ program, so I learned about C++ compiler optimisation options, copy avoidance, cache optimisation and concurrent programming. I didn't go learn all about lexers, parsers, abstract syntax trees, grammars, assembly, processor specific instruction sets, OS thread management and a hundred other topics while my code sat there unoptimised.

We only have finite time and resources; "learn everything you can about everything" is an absurdly unrealistic piece of advice. OP is struggling to develop a mental model for allocating memory in C; learning how to build Tetris from the NAND level up is NOT an efficient way to accomplish this task.

level 2
122 points · 8 days ago

Because abstractions are leaky, and if anything goes wrong with your carrier pigeon you need to know how it happened. He was loyal to you for years, and the same thing might happen again.

level 3
40 points · 7 days ago

F

level 4

F

level 5
2 points · 7 days ago

F

level 5

There was an attempt.

level 6

G

level 7

M

level 8

General Motors?

level 3
13 points · 7 days ago

I think this point is important. Too many software engineers or coders don't go the extra step of understanding the protocols, the servers, the architecture. They don't bother to understand how load balancing works or how LDAP works; they just write their functions in their happy little test spaces, and then when something goes wrong they are surprised. They don't get that the code they are writing is part of an ecosystem, and each part of that ecosystem needs to be healthy. From a sysops viewpoint these are the coders you avoid, and usually the ones who take the time to get it not only make the best coders, they make the best sysops folks.

level 4

Correct. I'm finding it more and more when I apply for positions that someone who has knowledge across the spectrum (support, sys admin, networking, development (sql / code)) is more desirable. A lot of graduates are brought into developer roles where they strictly script code and test within dev environments, but have little knowledge of much else outside of that bubble. Some are not even gaining experience with IIS configurations, server requirements, and keystores.

level 5

I don't see the problem in specializing in a specific area of a broader field. It's unreasonable to expect everyone to have studied every aspect of a development environment straight out of school. I think it's fine to learn one area, become very good at what you do, and not really worry about learning stuff that honestly should be someone else's job. We're humans, not machines. Maybe I'm an idiot though...

level 2
[deleted]
53 points · 8 days ago

Is curiosity a big enough reason?

level 3
59 points · 8 days ago

It's good to be curious, better to be practical. There's more knowledge in computer science (or any field, really) than anyone could learn in a single lifetime. It's impractical to learn how an IDE transforms code into 0's and 1's before you've successfully written a bubble sort. Similarly, there's no point in delving into OS theory before understanding classes and objects.


If OP had an abundance of free time then by all means delve into these lower level topics all they like. But otherwise their time is better spent keeping to a relevant level of abstraction. I mean, imagine if someone just used `pip` to install a module and, instead of learning how to use that module, they wanted to figure out how `pip` managed dependencies, how it communicated with repositories and how it built packages. It obviously wouldn't hurt to know, but it wouldn't really change the way they used pip and it wouldn't help them with the module they just installed.

level 4

This!!

I don't think this could've been said any better... in fact, I often fall into this trap, but lately I've learned to stay on track. This is mostly a matter of focus: you need to discard all the irrelevant information and focus on the task at hand.

level 5

Practical skills can make you a good programmer, but those details in those “don’t need to know” areas can set you apart...

I have been immersed in programming since early high school, started learning electrical basics in the later grades, and have a degree in Mechatronic Systems Engineering, which gave me an understanding of electrical, mechanical, and software engineering. I also took real-time operating systems as an elective, and a number of business courses.

In most jobs I’ve had I’ve worked with developers that know practical skills, they’ve come from CS backgrounds and can be good developers, get things built quickly, and usually to a good quality. But there have been quite a few times where we’ve had more advanced tasks, involving more advanced math, non-trivial multithreading scenarios, and features that looked large and difficult to build. This is where the extra knowledge I’ve acquired has been able to help in either math, understanding OS architecture and how it works, or using abstract concepts from other areas like electrical or mechanical design to provide alternative solutions.

Because of this I’m highly valued and well paid. If you stick to your job title and learn only the practical skills, you will do well. Learning the non-practical skills, if you have the time for it, and finding a way to make use of them can make you an asset.

level 6
1 point · 7 days ago

If there was knowledge necessary for building or enhancing the features they were asked to deliver, then by definition that is practical knowledge and skill.

Practical is defined as

of or concerned with the actual doing or use of something rather than with theory and ideas.

An example of knowledge that would not be practical is knowing the circuit diagram for a RAM module when you are working on numerical software for weather prediction.

level 5

I have found it tough sometimes to be able to differentiate between relevant and irrelevant information. I often find myself missing the main point of lessons and digging too deep to learn why things are happening the way they are.

What has helped recently, if I don't understand a topic, is utilizing multiple specific resources instead of digging into something's background.

level 2

Oh man we had to read that OS Concepts book for my OS course. It was painful but really interesting.

level 2

There’s a concept called “mechanical sympathy” that describes how having a certain amount of lower-level understanding can help in higher-level tasks.

level 2
7 points · 7 days ago

As my four-year-old would say when she’s getting a shot, “No thank you!“

As a system engineer I don’t want to support your application. This type of thinking often results in shortsightedness that causes me headaches.

Classic example: an otherwise well-written, performant application works great until the firewall has a failure and switches over to the standby. The TCP/IP connection is lost. The app was written without considering that the network isn’t 100% reliable, and it never recovers.

If you’re an agile shop, maybe this isn’t a big deal and you fix it in the next few sprints, as long as it’s not blown off as a rare issue. However, in a waterfall shop this kind of problem can last for months or even years.

A dev that has a more holistic view is more likely to be able to spot potential issues like this in advance, saving me headaches.

level 3
Original Poster · 1 point · 7 days ago

Agree. I can also tell this is significant by the improvement in my programming skills as I learned more and more about the overall workings of computer systems.

level 2

Because if your signal doesn't do what it's supposed to do, you need to be able to figure out why. Abstraction is part of the toolkit, but the fundamental stuff is the basis.

level 3

Yeah, you have to play through it further. If you reach a problem and say "why do I care about that?" you're looking for a new job.

level 2
5 points · 7 days ago

if I write a query to a server and I follow the API's specs, why do I care if it's routed through a local network, sent through the internet, flown by a carrier pigeon or conveyed by a smoke signal?

I've seen some stories on /r/talesfromtechsupport that give cases where you might care. The one that comes to mind was a business-critical triply-redundant system taken down by one network card fault. Because the three redundant machines were virtual machines, run on the same physical hardware. Sometimes, moron-proofing requires being able to peek through the veil.

level 2
3 points · 7 days ago

The dinosaur book seems to have several later editions. Which one is suggested?

level 2
2 points · 7 days ago

I think there is some merit to learning the underlying systems at work, unless you always work at a very high level. I'm taking this out of the golang playbook, but computers are not keeping up with the speed at which we are developing more complex things. Eventually we will have to tackle the issue of making code more efficient.

Fortunately this is more at the design stage of a language, but I can't imagine it's bad to know these things.

level 2

The fact that you wrote a garbage driver suddenly isn't an unnecessary detail when it garbles all the API calls I make.

level 1
40 points · 7 days ago

The book "Code" by Charles Petzold changed my life in how I understand computers and how they work. He starts from the ground up (literally) in explaining computer language. I recommend it for beginners and experienced users/programmers.

level 2

Came here looking for this

level 2

I can't recommend this book enough

level 2

I can see it was published in 1999/2000. How relevant is it now, 20 years later?

level 3

Physics doesn't really change much so it still works today.

level 4
4 points · 7 days ago

when is Newton dropping Gravity 2.0

level 4

20th century was quite rough for physics tbh.

level 5

Definitely the 20th had a lot of change, I have to agree with you on that.

level 3
5 points · 7 days ago

It's very, very relevant. It builds from almost nothing to understanding a fully functioning computer system. And it's surprisingly simple.

level 3

Ah yes, '99. I remember looking on in awe at the freshly minted locomotive travel machine. Such a contrivance seemed odd to most folk; what use could a metal horse wagon be and how does it move? But sure enough, those thinker-Trevithick-tinkerers had already started laying down track and the future truly seemed thrust upon us. It was a simpler time for a simpler people but I'll be damned if we didn't have our roots. I'll give it to that Monroe boy though; the fella knew how to purchase a Louisiana. Bit pricey at 4 cents an acre though. A man could go a long ways with 4 cents back then. The dotcom bubble put an end to that though.

level 3
2 points · 7 days ago

This information is as relevant today as it was when it was written. Considering the concepts discussed in the book date back to the 19th century, I would say the content will hold its relevance for quite some time.

level 1

Check out Nand2Tetris - it pretty much explains everything from the logic gate on up to a functioning system. I took a course last year based on this and found it to be quite fascinating and eye-opening.

The original site is here: https://www.nand2tetris.org/

You can take a course on Coursera based on the site as well: Build a Modern Computer from First Principles

level 2

Just adding this link to a game based on the first part, nand to computer: http://nandgame.com/

level 2
[deleted]
5 points · 8 days ago

Not OP but many thanks. Have registered.

level 2

The coursera courses (it's in two parts) for this are great. 100% recommend. Also the nice thing about doing it through coursera is that it helps you keep to a schedule, and there are forums where it's easy to find answers to common issues.

level 3
2 points · 7 days ago

I'm doing the first course atm.

I just finished building the RAM and program counter and now I'm working on the machine language.

It's really cool.

level 1
11 points · 8 days ago

Check out this book Crafting Interpreters

It gives you a detailed walk-through of general compilation and interpretation techniques by actually building two interpreters in Java and C.

The author explains each and every line of code without throwing complicated jargon at you. Any jargon that does come up is usually explained early on in the book.

I am at chapter 4 of this book and am really enjoying building an interpreter from scratch.

level 1

The best I can tell you is this: your frustration and your curious mind are good indicators that you will become a great subject matter specialist in programming. Your impatience, however, is not such a good indicator.

All great subject matter specialists I have known became successful because they developed the art of balancing the micro and the macro understanding of what they have mastered. Rarely do most people need to know the details of a subject matter in its micro operations. [There are exceptions to the rule, always. Yours may be a mind like Einstein's, in which case your discomfort and frustration are justified.]

I will never underestimate the advice I received in the following words: "Easy does it."


level 2

This.

I was also curious and impatient and finally found out that I really need to restrict my interest in a very tiny area in one subject to get something meaningful done in my life. I still can't do it though. It's kind of difficult for minds like ours.

level 1

You could dick around with Compiler Explorer a bit. It's a web tool that allows you to type code and immediately see how it translates to assembly. Maybe it is one additional piece to your puzzle.

By the way, I appreciate your effort of trying to understand the entire stack. It will make you a true professional. 👍

Just don't overwhelm yourself. One step at a time. 🙂

level 2

My god

Thank you for this, it EVEN HAS ARM assembly, yes!

level 2
Original Poster · 1 point · 7 days ago

How wholesome. Thanks person.

level 3

Just going to jump on u/jones_supa's recommendation and say that you should give assembly a go. Maybe start out with NASM assembler and make some simple programs. You basically have to know how memory works to use it correctly, and it seems to be a more straightforward path than learning how compilers work.

level 1
10 points · 7 days ago

"Computer Systems: A Programmer's Perspective" has everything you need (including networks). Really, really good book.

level 1

knowledge is endless
the more you learn the more you realise you don't know shit

level 2

At least you know that there is less shit that you know now than before when you didn’t know more shit.

level 2

This is true, but I didn't have to learn much to realize I don't know shit.

level 1

I recommend watching Computerphile on YouTube. They interview university lecturers and such to have them explain a lot of the fundamentals of computer science in an approachable way. Some of it covers high-level stuff like hashing algorithms and some covers lower-level stuff like building logic gates out of dominoes. I’ve found it accessible and very handy for filling in gaps in my general computing knowledge. E.g., in an interview at Microsoft I was asked, “How does the internet work?” and I was pleased I’d watched the episode on routing, as it helped me give a better end-to-end answer.

That series gives you good ideas about what’s interesting to know, and you might find it useful as a jumping-off point to research some of the concepts in more depth.

level 1
14 points · 8 days ago · edited 8 days ago

Learn how processors work: Digital Design And Computer Architecture

Learn how networking works: An Introduction to Computer Networks

Learn more about Computer Systems: Computer Systems - a Programmers Perspective

Learn about Operating Systems : Modern Operating Systems

level 1

I have a Computer Engineering degree, and they taught us exactly what you want to know.

Honestly, it is a lot to understand, because to know how memory works, you need to know how the CPU interacts with it and, therefore, how the CPU works.

If you want to know how your data goes through a wire to reach your server, you need to know about networking and how a packet works. Once you understand that, you need to know how a network works, the different protocols, and the different approaches to how computers talk to each other. And maybe how the internet works as a whole.

There is no real shortcut since everything is connected.

level 2
3 points · 7 days ago · edited 5 days ago

Eh, c'mon. You and I both know things can be abstracted away. We had to take some computer architecture semesters and some electronics classes. We learned stuff without having to know everything.

Start with how C++ reserves memory for itself and then look at how fail-proof SR latches are made. She’ll get the gist of what’s going on without having to know everything.

Networking is another beast, but she can get the gist of it by googling network stacks and which layer is associated with which protocol and hardware.

Edit: gender correction

level 3
Original Poster · 1 point · 5 days ago

I'm a she, btw

level 4
2 points · 5 days ago

Oof good catch. Too many guys in EE.

level 1

Start here:

[What Every Computer Scientist Should Know About Floating-Point Arithmetic] (https://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html)

Follow up here:

What Every Programmer Should Know About Memory [PDF]

These won't answer all of your concerns but they are directly useful for programming and you said you wanted to know where to start. This is a good starting point that's not an entire book or several as suggested by others. There's no need to learn all of everything overnight.

level 1
6 points · 8 days ago

I think that while it's nice to have a deeper understanding of how a computer works -- it's actually not at all required to become a brilliant programmer. Understanding how data flows, how many times data is read, how long something will take, and what decisions to make based on expected data types is FAR more important than understanding how C gets turned into binary. While raw C does deal heavily in pointers, modern C++ doesn't really (see: shouldn't) deal in pointers very often. Languages like Java, Python, and Ruby (not sure about Ruby -- I've not worked with it in tremendous depth) abstract that sort of stuff away from you completely due to how the languages are architected.

There's a really quick and easy explainer video you can start with here: https://youtu.be/QXjU9qTsYCc If you want to keep building beyond that and have a deep functional knowledge, the dragon book (mentioned below) is sort of the de facto reference for this sort of stuff.

level 1
3 points · 7 days ago

Computer Science Crash Course did it for me. Fantastic explanations. Made by the Raspberry Pi Foundation.

https://www.youtube.com/playlist?list=PL8dPuuaLjXtNlUrzyH5r6jN9ulIgZBpdo

level 2

Was going to recommend this myself. Give this a watch, OP. Really comprehensive and well-presented.

level 1

Do you understand binary and binary addition? Take a look at a 4-bit adder; you can even look at Minecraft examples of them to see how silicon logic gates come together to add numbers.

Assembly really translates to binary instructions that tell the processor which path to take for each instruction: which registers to grab, and whether to send them through the adder, the multiplier, etc. In a way it's like routing train cars that carry data through the processor.

So looking at the 4-bit adder shows how numbers are manipulated, and the train metaphor roughly shows how manipulation operations are selected. How are they stored? When they come out of the adder/multiplier unit, the assembly is predefined to put them into a specific register. Registers store binary using logic gates formed into latches; again, you can look at Minecraft examples of latches like the S/R latch.

Some assembly instruction sets have only 16 possible instructions, stemming from a desire to represent each instruction in the fewest possible bits (in this case 4) while still being able to accomplish pretty much anything through combinations of the instruction set.

If you have questions feel free to shoot me some follow-ups, or look into some EE classes at your college. The digital logic series is what you want to look into. Go meet the professor beforehand, drop by their office; any professor worth their salt will chat with you even if you aren't in their class.

Books are great, but there is no substitute for learning from someone who wants to share their passion for the subject.

level 1

I can give you a few topics of research and some general philosophical guidance if you would like.

For bare metal read up on semi-conductors and their role in logic gates.

For very low level computation take a look at the history of the CPU and in particular the transition from punch cards to x86. This will tie in the logic gates to assembly language.

For low level languages - Yacc and Lex and how a language like C is parsed into tokens and compiled into assembly. You may even want to write a simple language of your own.

For basic operating systems start with the four managers; memory, processes, file and I/O. The process manager is pretty interesting in the way it time slices a task.

For networking the holy grail is the 7-layer OSI model. It shows the relationship from bare-metal pulses sent through a wire all the way up to HTTP. A more practical topic is to look at RESTful services in layer 7.

Okay, now for the philosophical stuff. You could spend years on any one of those topics and barely scratch the surface, and it won't make a piss of difference to your programming skills. It's interesting, sure, but it has nothing to do with programming. Ultimately, programming is all about abstractions, and we build useful applications and tools by sitting atop a pillar of prior work that allows us to concentrate on the task at hand. Our job is difficult enough without breaking down those carefully constructed abstractions, and it's extraordinarily rare that we ever need to.

The frustration I saw in myself many years ago was the same as yours, and I realise now it was misdirected. What I was really frustrated about was not experiencing those things coming together to make something meaningful. The path to that is working up the layers of abstraction, not down: picking a high-level framework or two and finishing a modern client-server application.

level 1
3 points · 7 days ago

I had the same questions in university. Therefore, I switched my major to CE. Now I know how it works and how it’s made!

That being said, I forgot how to use GitHub, and I never did learn the difference between PHP and SQL.

Just dive into things you’re interested in and you’ll be happy. You don’t need to know everything.

level 1
4 points · 8 days ago

Knowing all of that is a huge undertaking. But if it's truly interesting you, I'd start reading about computer engineering. Start with the idea of transistors, what logic gates are and how they're built out of transistors, and move towards how rudimentary processors and memory work.

If you're just trying to get an idea of how it all works in order to better "get" what programming is, then reading about it is probably enough.

level 2
Original Poster · 3 points · 8 days ago

I was not looking for too nuanced details, just an overall idea of the big picture, enough to trace my code to memory and back. Any recommendations?

level 3
4 points · 8 days ago

Compiler design, maybe? That and assembly language would get you basically to the step where it’s converted to machine language. Again, like people have said, it’s not super necessary knowledge to have, but some of it is pretty interesting.

level 1
2 points · 7 days ago

When you drive your car, do you feel like you must understand at the molecular level the intricacies of the combustion engine in order to be a good driver?

level 2

Programmers are not analogous to car drivers, they're more like manufacturers and mechanics. The user would be a driver, they don't need to know the intricacies. But programmers should on SOME level, not necessarily molecular.

level 3
Original Poster · 1 point · 7 days ago

Concur.

level 3
1 point · 7 days ago

The metaphor still stands, and arguing specifics over an obviously simplified metaphor is the exact brand of thinking that got OP stuck in the first place.

But hey, what do I know? I've only been programming as a senior for about 20 years now.

level 1
2 points · 7 days ago

Go to school for computer EGR or look for classes online about computer EGR/ELE. I'm an ELE student and this is what I've been learning for the past 2 years.

level 1

Crash Course on YouTube has a computer science playlist that doesn't go into coding at all; it just goes into the basics of... exactly what you said: how computers work and communicate, without super technical details.

They start from the beginnings of modern computers and work forward so you can understand why things work the way they do, and it helps you understand why we have the layers of abstraction we have.

level 1
2 points · 7 days ago

Are you still in a CS program? Have you taken classes like computer architecture, operating systems, and compilers? Because those 3 gave me a really solid foundation to understand exactly what you are describing.

level 2
Original Poster · 1 point · 7 days ago

I'm an SE major, I plan on taking these as electives though, after I finish my scheduled 2nd year classes.

level 1
2 points · 7 days ago

I'm not sure if this is really what you're looking for, but this series by Ben Eater where he builds an entire computer on a breadboard is really amazing. He's great at explaining things in an easy to follow way and it helps give you an idea of how a computer actually works all the way down to the logic gates.

level 1
2 points · 7 days ago

Imho, while most people here are advising books (some of which I DEFINITELY recommend anyway) if you want a quick walkthrough I think the best option is simply... any university.

Just google a bit, find a Computer Science course’s website (for example, Cambridge’s Algorithm Design course material can be found legally online) and read that stuff.

It’s way shorter, if the slides are at least decent you get to understand the subject, it’s free and you end up having a general yet fairly solid idea of pretty much everything you need


level 1

You should look into a book called Invitation to Computer Science. It's easy to follow and walks through how the 0's and 1's work with hardware and how that translates to software.

level 1

i know i'm late to the party, but it sounds like you want to learn networking

level 1

The book “Computer Organization and Design” by David A. Patterson has good info on this. It’s the textbook we use for my computer architecture class. It’s a pretty easy read and it’s cheap.

level 2
Original Poster · 1 point · 7 days ago

Thanks, I'll check it out

level 1
2 points · 7 days ago

See if "Inside the Machine" by Jon Stokes helps. It is an old book, so you might be able to borrow it from a library or get it cheap from a used book store.

If you want to know more details, there are a few books on computer architecture, like Patterson and Hennessy's, and Silberschatz's or Tanenbaum's OS books.

level 1

I'll try to give you a high-level answer without writing a mini-book.

Let's say you have a processor - you could write your program in binary as a sequence of instructions, and the processor would start reading them from memory and executing them. Each instruction is expressed in binary, something like 0011001110011 (usually longer than this, e.g. 32 bits or more). If you know the instructions of your processor, you know that the first X bits identify the instruction (e.g., move, add, etc.), and the rest might be used for parameters/data - so your processor can identify what to do for each binary instruction.

Now, writing binary is not fun, so you can write in assembly language. Much like binary, this language changes depending on the processor you're using (although a lot of the instructions are similar), but the main idea is that instead of writing binary, you now write "text". So now you can write something like "MOV R1, R2" instead of 0011100001111. In order for this to work, you use a program called an assembler, which translates the text in your program into binary instructions.

Writing assembly becomes tedious for larger programs, not to mention the fact that it's CPU-specific (you'd have to change the code to make it run on a different CPU), so you write in a "high-level language" like C instead. As you know, you then use a program called a compiler to process your C code - the compiler turns C into assembly, and then calls the assembler to turn that into binary. You can easily see this in action by asking your compiler to output assembly instead of binary. Notice that the translation from C to assembly is less "direct" than the one from assembly to binary (i.e., there are many more possible translations from C to assembly than there are from assembly to binary), so your compiler will try to write "optimal" code. The compiler also translates the C code into assembly for whatever CPU you have, and this is why C can be used on different CPUs without changing the code (hence the name "high-level language").

There are also other languages that get compiled to C, or run on a "virtual machine" written in C or C++ (e.g., Ruby, Java, Python, etc.), but that's just an extra step, the fundamental process doesn't change.

So now you can write in C, but you probably want to use other components in your computer, e.g., you might want to read from the hard drive, or send data to your monitor to be displayed. You might also want to run more than one program in parallel, handle logins/passwords for the users on your computer, and so on. These (and many more) functionalities are bundled up in a program called "operating system" - when it boots up, your computer runs this program, so that any other programs can use the functionalities it provides.

The functionalities offered by the operating system can be accessed through "system calls" - these are functions you can call from your own C code, and they allow you to read and write files, run two programs in parallel, and much more. You often don't call them directly, because your language has a standard library that calls them behind the scenes (so when you call "readFile", the code in readFile is actually making the system call that asks the operating system to open a file and read from it).

In terms of networking (e.g., talking to a server), there are system calls that let you use your network card/hardware to send and receive (binary) data over the network. The communication is implemented through a stack of protocols (physical, IP, TCP, etc.) that describe how two or more computers can exchange binary data over the network (cables, radio waves, going through other computers before reaching the destination, etc.). So when you call a library (or framework) to send data to another computer, behind the scenes your code ends up making a system call, and your data gets broken into binary packets to be sent over the network according to the networking protocols.

I've skipped a lot of the details, but this is the general process - please ask me more questions about the parts that I haven't explained well :-)

level 2
Original Poster2 points · 6 days ago

Thanks a lot!! Giving you gold in my mind.

level 1
3 points · 8 days ago · edited 8 days ago

I'm going to give you advice that I suspect answers many of your questions and doesn't take long. Learn 6502 assembly from this page: https://skilldrick.github.io/easy6502/ The 6502 is what old consoles ran on (from Wikipedia: Atari 2600, Atari 8-bit family, Apple II, Nintendo Entertainment System, Commodore 64, Atari Lynx, BBC Micro and others). After you assemble, you can click the disassemble or hexdump button to see the difference between `lda #$01` and `lda $01`. The colors in the black box next to the code are affected by bytes in memory at $200-$5FF. Good luck

level 1
2 points · 7 days ago

Thank you so much for putting this into words! I'm just starting my journey and I'm very frustrated because my books taught me how to use classes and objects, but I genuinely don't understand how they actually work. Any recommendations for this specific issue? I'm trying to learn Python for data science

level 1

It's impossible to know every nook and cranny of the OS, the hardware, and whatever software you are learning. No one person invented it; it took hundreds of thousands of people working over decades.

One of the most important lessons for an engineer is to understand this limitation and look for a solution. Look up "black box" - this will tell you how to approach your frustration. Even though you won't ever know the inner workings of your CPU (Intel/ARM/AMD designs are proprietary and closely guarded secrets, after all), you can model it as a black box with inputs and outputs, freeing you to do what you need. Look at programming the same way, along with all the APIs invented by other developers. You won't understand their internals, but as long as you can feed inputs to these APIs and get outputs, they should do the job for you. If not, then you have to specialize and build it yourself, but that takes time and, most of the time, combined effort.

level 1

Good old DOS was great for this. I remember spending many hours with the Microsoft C compiler, looking at listings of the object code.

These listings showed the C statement in the context of the assembly statements it compiled into. MS-C produced some remarkably optimized assembler code.

These listings also make it clear what part of the code is "yours" (that is, translated from your statements) and what part are done by calls to library routines, and most importantly in this topic, how parameters are passed. You've probably got a good idea about how stacks are built and used, but when you see how simple the actual assembler code is, you just go "wow".

Most modern languages don't actually "compile" the same way; they're executed by a runtime module that reads your program as if it were data, then processes that data (i.e., executes it, because the data is code in this case). I don't know how easy it would be to get, for instance, Python to express itself at the machine level. But if you could do it, it would be an eye-opener.

level 1

How I see it: a 64-bit processor takes 64 bits from memory, performs one of its operations on them, and spits the changed bits back out to memory. That's pretty much what your program does :P

level 1

I think this is the reason most CS or IT courses and degrees use a top-down approach. It enables you to start using working knowledge without worrying about the lifetime of work others have put in for you. There are math formulas that took numerous individuals a lifetime to produce and prove. Imagine doing that instead of just using a calculator or formula.

I recently had to overcome this as well. My curiosity gets the best of me, and I understand this stuff at a basic level that satisfies me, but the problem is that it hinders your learning process because it's a distraction. You end up diving into engineering, physics, and electronics theory, rather than learning what will make you money.

Watch the CS crash course videos on YouTube. They cover it quick and in enough detail to satisfy you.

level 1

Just YouTube the topics and gain a general knowledge of what you want; I don't think you ever really need to be an expert on those topics. I mean, if you made it through a CS degree you should have taken a digital design course, an assembly language course, and an OS design course as requirements. Even after those classes I can't say I'm super informed on how it all works at the lower levels; that's kind of the computer engineers' job to worry about, imo.

level 1

Another vote for NAND2Tetris.

level 1

As for malloc, you could write your own. Google a tutorial.

level 1

Wow, this is a huge problem that not every coder (including me) seems to be aware of. I just ordered the famous "dragon book" and signed up for the Nand2Tetris course, which until now I didn't even know I needed! So thanks a lot! ;)

level 2

Problem?

level 1

You need to read books, all the way. Lots of courses kind of hold your hand, but to really get all the knowledge, you gotta pick a book, stick with it, and read it all the way.


That's what's worked for me in the past.

level 1

Thank you! You formulated the question I've wanted to ask for years. I work in IT and consider my knowledge average, but I don't have much time for all the information I want to learn. They always say you cannot know everything, but these are the points that will surely help me.

level 2
Original Poster1 point · 8 days ago

Thank you. I would often search the subreddit looking for something like this, until I figured I should stop looking for my confusion articulated here and just write it out myself. It took me a while because I didn't really know where to even start. It's all muddled in my head.

level 1
1 point · 8 days ago · edited 7 days ago

Here's the thing.

As far as we software-engineering plebs go, we are never going to build a CPU from scratch. What we can do is learn from the people who do that job and try to fit it into our mental model. But remember: we are here to deliver good, maintainable software. If a MOSFET blows up down there, there's no way for us to fix it. Our knowledge is going to be mostly theoretical, and that's fine. Being curious is perfectly fine too, but make sure you get your priorities right. Pick a field and try to go as deep as you can. As you go along, you will see other fields - things you ignored initially - showing up in several places, and mastering them could take a lifetime.

When I say field make sure it is narrow enough. CS is huge and it will be a massive undertaking to master everything.

And one thing I would suggest is start reading source code written by other people. This can do wonders to your understanding of the bigger picture.

level 1

I've been watching this YouTube channel for a while now and I think this video somewhat fulfils what you are looking for. It shows, in a nutshell, how programming languages "talk" to the PC. It doesn't cover servers or the OS, though. But it's a short video, so no harm watching, amirite? ;)

level 1

"But How Do It Know?" - the book - and the Crash Course Computer Science video series on YouTube: 41 great videos.

level 1
0 points · 7 days ago

There’s an interesting course for free on Lagunita from Stanford on compiler engineering.

level 1
0 points · 7 days ago

I'd recommend having a look at microcontrollers. You could get a good insight to lower level programming, processor architecture and hardware interfacing.

level 1
-4 points · 7 days ago · edited 7 days ago

I know Python, Java, C, Ruby, so it isn't about languages.

I am just constantly bothered about not really understanding how the stuff I wrote on my IDE became 0's and 1's and how that produced what I asked the computer to produce.

These do get in the way of coding, because when I'm doing memory allocation in C I really feel the need to imagine the bigger picture of memory inside the computer.

So then I guess you don't really know those languages; you merely think you do.

You're overstating your competence, i.e. you're at the start of this graph. You believe that because you've written hello-world in 10 different languages then that means you "know" that language, but it doesn't. You've barely scratched the surface. I would say you can't really "know" a language until you've written a moderately sized program in it AND figured out exactly how that program eventually gets executed by your CPU.

One of the ways of improving your knowledge is by recognising what you don't know. And right now it seems you don't know about any language. So pick one and figure out how it works. I'd offer Java as a good first candidate.

Try figuring out the answers to these questions:

What does javac do? What does java do? What's in a .class file? How do .class files work? What's JIT compilation? Is JIT actually necessary for Java? What would AOT compilation look like in Java -- would you still get a .class file? What's the difference between static and non-static members in the class files? How do all of the different languages interact on the JVM?

When you start to hit knowledge boundaries that rely on OS and CPU stuff (e.g., what a static vs dynamic library is, what a calling convention is, etc.), then you can shift over to asking similar questions about your C build process.

(The graph is the first one I could find on Google Images and comes from this blog - not the one I was thinking of, but it'll do.)


edit: A lot of the other comments are on the wrong track. Studying computer engineering is fascinating [my bachelors degree was Comp eng], but won't help you right now. What will help you is

  1. Realising that you don't know what you think you know, as everything you know is based on magic abstractions

  2. Figuring out which of the abstractions it is you don't know and learning about them

Though I do echo the 6502 suggestion!

level 2
Original Poster1 point · 7 days ago

For example, I know/understand the answers to 7 of those questions but the ones I don't understand are big enough reminders of my superficial knowledge. I would say I'm well above the Hello World phase, I understand why I seg faulted on that line of code, or why the print statement returned an address, but far from having a wholesome understanding.

level 3
2 points · 7 days ago

For example, I know/understand the answers to 7 of those questions but the ones I don't understand are big enough reminders of my superficial knowledge.

Which questions do you know/don't know?

I would say I'm well above the Hello World phase,

Could you quantify that? e.g. What's the biggest program you've written? How many files/LOC, and what did it achieve?

I understand why I seg faulted on that line of code, or why the print statement returned an address, but far from having a wholesome understanding.

You may understand that you seg fault on line 23 because my_ptr points outside a region returned by malloc, but again, that's a very superficial understanding.

e.g. do you actually know what a seg is and what it means to fault? Do you know how malloc actually works? (could you implement your own version? It's entirely possible to run C programs without calling malloc, yet still have dynamic memory usage -- would you know how to do this? Could you actually do it? Like right now?). Do you know on Windows that you don't segfault but that you "Access Violate" -- do you know what that means? Which one is more "accurate" in terms of an X86?

ETC.

The bottom line is: if you're worried about what you don't know, a simple way to figure it out is to do the 3-whys (or 5-whys, or whatever people say now). Take a completely random concept that you use every day in your programming: how far can you personally "drill it down" before you run out of steam?

And -- at which point will you be personally satisfied? At the build level? OS level? CPU level? RTL? Gates? Transistor? Electronic/electrical? Turtles?

You need to determine that as well if you really want to help yourself. (And, if you don't know, just pick a higher level and go with it until that's always your stumbling block)

level 4

I know you're kinda getting downvoted but this tough-love advice is really valuable (to a novice like me anyway). It's a bit much to assume OP is only on "Hello World" in the 3rd & 4th languages solely from this post though. Do you have any resources that someone like OP could reference to learn the true fundamentals you're talking about?

level 5
Original Poster2 points · 7 days ago

I concur on how the standards are set (aptly) high for someone to be deserving of saying they "know" a language. And here's to the tough-love advisors.

level 4

Good advice. Legitimate questions for anyone being interviewed for C as well.
