If you want to know how computers work, do electrical engineering. If you want to know how electricity works, do physics. If you want to know how physics works, do mathematics. If you want to know how mathematics works, too bad, best you can do is think about the fact it works in philosophy.
all roads lead to philosophy
Everything is philosophy until it becomes science. Unless it’s anything to do with politics then it just remains philosophy forever.
Science is a subdiscipline of philosophy.
If you want to know how philosophy works, do sociology…
It’s kind of like a horseshoe with philosophy and math at the ends.
A horseshoe capped off by Computer Science 😉
Maybe I’m missing something, but I’d count theoretical computer science as a subfield of math, and practical software engineering among the other engineerings on the harder side of the centre.
I wouldn’t disagree with that. Discrete mathematics was a core subject when I did my Computer Science course.
But I do still laugh when I tell people I’m a ‘scientist’, with my fingers crossed behind my back of course 😉
tbf all good programmers are good at math. Not classic arithmetic necessarily, but at the very least applied calculus. It’s a crime how many people use a mathematical discipline every day but don’t think they’re “good at math”, because of how laser-focused the world is on algebra, geometry, and trig as being all that “math” is.
Serious question; how does Calculus apply to programming? I’ve never understood.
PID control is the classic example, but at a far enough abstraction any looping algorithm can be argued to be an implementation of the concepts underpinning calculus. If you’re ever doing statistical analysis, or anything in game design having to do with motion, those are both calculus too. Data science is pure calculus, ground up and injected into your eyeballs, and any string manipulation or regex is going to be built on lambda calculus (though a very correct argument can be made that literally all computer science is built on lambda calculus, so that might be cheating to include it)
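To make the PID example concrete, here’s a minimal sketch in Python (all names and gains are made up for illustration): the integral term is just a running Riemann sum and the derivative term a finite difference, i.e. numerical calculus hiding inside an ordinary loop.

```python
# A discrete PID controller: the integral term accumulates a Riemann sum
# (approximating the integral of the error) and the derivative term is a
# finite difference (approximating the error's rate of change).

def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_error": 0.0}

    def step(setpoint, measured):
        error = setpoint - measured
        state["integral"] += error * dt                   # ~ integral of e(t) dt
        derivative = (error - state["prev_error"]) / dt   # ~ de/dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step

# Drive a toy first-order system toward a setpoint of 1.0.
pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
value = 0.0
for _ in range(2000):
    value += pid(1.0, value) * 0.01
```

After 2000 simulated steps the loop settles near the setpoint; swap in a real plant model and the same three terms are what an actual controller computes.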
Lambda calculus has no relation to calculus calculus, though.
Data science is pure calculus, ground up and injected into your eyeballs
Lol, I like that. I mean, there’s more calculus-y things, but it’s kind of unusual in that you can’t really interpret the non-calculus aspects of a neural net.
Lambda calculus has no relation to calculus calculus
I wanna fight your math teachers. No seriously, what did they tell you calculus is if it’s got nothing in common with lambda calculus?
Is there some connection I’ve just been missing? It’s a pretty straight rewriting system, it seems Newton wouldn’t have had much use for it.
Lots of things get called “calculus”. Originally, calculus calculus was “the infinitesimal calculus”, IIRC.
I think the issue here might be the overloading of terms - lambda calculus is both a system of notation and the common name for the conceptual underpinnings of computational theory. While there is little to no similarity between the abstracted study of change over a domain and a notational system, the ideas of function composition, continuous function theory, or even just computation as a concept are all closely related to basic concepts from “calculus calculus” like limit theory and integral progression.
edit: clarity
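To show the “pretty straight rewriting system” side of this: a tiny sketch of Church numerals using Python lambdas (the names are mine, purely for illustration). Numbers are encoded as repeated function application, and addition falls out of composition, which is where the overlap with function-composition ideas from “calculus calculus” shows up.

```python
# Church numerals: natural numbers encoded purely as higher-order functions,
# the way the untyped lambda calculus represents data.

zero = lambda f: lambda x: x                              # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))           # one more application
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Read a numeral back out by applying it to (+1) starting from 0.
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # prints 5
```

Every reduction step here is just substitution, which is exactly the rewriting-system view; no infinitesimals anywhere.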
I’m pretty sure the term was coined in the interwar era, so it’s kind of interesting if people are just calling the concept of functions “lambda calculus” now. Obviously they’re much older than that.
Had a graduate Dev who did not have a fucking clue about anything computer related. How tf he passed his degree I have no idea.
Basic programming principles? No clue. Data structures? Nope.
We were once having a discussion about the limitations of transistors and dude’s like “what’s a transistor?” ~_~#
I’ve met people like that too.
It’s called cheating, lots of people do it.
The most worthless dev I’ve met was a comp sci graduate who couldn’t hold a candle to a guy who did a dev boot camp.
The best dev I’ve met so far didn’t have any credentials whatsoever; the second best did a 2yr associate’s.
Tied for 3rd: one with an associate’s and one with a 4yr degree.
I was partnered with that guy for one class in grad school. We were working on a master’s degree in software engineering, and the assignment was analysis and changes to an actual code base, and this mofo was asking questions and/or blanking on things like what you mention. I can’t remember the specifics but it was some basic building block kind of stuff. Like what’s an array, or what’s a function, or how do we send another number into this function. I think the neurons storing that info got pruned to save me the frustrating memories.
I just remember my internal emotional reaction. It was sort of “are you fucking kidding me”, but not in the sense that somebody blew off the assignment, was rude, or was wrong about some basic fact. I have ADHD, and years ago I went through some pretty bad periods with that and overall mental & physical health. I know the panic of being asked to turn in an assignment you never knew existed, or being asked about some project at work and having no idea whatsoever how to respond.
This was none of those. This was “holy shit, this guy has never done anything, how did he even end up here?”
Tbh, as a dev, knowledge of transistors is about as essential as knowledge of screws is for a car driver.
It’s common knowledge, and in general maybe a little shameful not to know, but it’s really not in any way relevant to the task at hand.
Maybe for dev knowledge, but computer science? The science of computers?
Well, computer science is not the science of computers, is it? It’s about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.
We all know how great we IT people are at naming things ;)
It wasn’t named by IT people, though. It was named by academics. And it’s not about using computers, it’s about computing. Computer science is older than digital electronics.
Mhm, and those academics weren’t IT people and had nothing to do with computers?
Let’s fact-check that.
Computer science as an academic course was first offered at Columbia University in 1946, through the Watson Scientific Computing Laboratory that IBM had founded there the year before. IBM had completed its first large-scale machine, the ASCC (the Harvard Mark I), two years earlier and wanted people who could operate it and continue to develop that line of work.
You are right man
Depends on the context. When my company pitches me to a client for work, I am; but oddly, during my yearly performance review I’m just some schmuck who programs.
“Engineer of Information”, please 😎
I mean, nowadays you need to be very smart and educated to google efficiently and avoid all the AI traps, misinformation, Stack Overflow mods tripping, Reddit threads on an issue with half the comments deleted because of the APIcalypse, etc… so you could argue that you’re somewhat of a scientist yourself
I mean, I apply various kinds of science, but I’m not actually doing any science, so I don’t think of myself as a scientist. What I do is solve problems - I’m an engineer.
I literally have no idea what this picture means, and at this point I’m too afraid to ask.
The typical holder of a four-year degree from a decent university, whether it’s in “computer science”, “datalogy”, “data science”, or “informatics”, learns about 3-5 programming languages at an introductory level and knows about programs, algorithms, data structures, and software engineering. Degrees usually require a bit of discrete maths too: sets, graphs, groups, and basic number theory. They do not necessarily know about computability theory: models & limits of computation; information theory: thresholds, tolerances, entropy, compression, machine learning; foundations for graphics, parsing, cryptography, or other essentials for the modern desktop.
For a taste of the difference, consider English WP’s take on computability vs my recent rewrite of the esoteric-languages page, computable. Or compare WP’s page on Conway’s law to the nLab page which I wrote on Conway’s law; it’s kind of jaw-dropping that WP has the wrong quote for the law itself and gets the consequences wrong.
I’d honestly be interested in where you’re from and how it is in other parts of the world. In my country (or at least at my university), we have to learn most of what you described during our bachelor’s. For us there’s not much focus on programming languages, though, and more on concepts. If you want to learn programming, you’re mostly on your own. The theories we learned are a good base, though
I’m most familiar with the now-defunct Oregon University System in the USA. The topics I listed off are all covered under extras that aren’t included in a standard four-year degree; some of them are taught at an honors-only level and others are only available for graduate students. Every class in the core was either teaching a language, applying a language, or discrete maths; and the selections were industry-driven: C, Java, Python, and Haskell were all standard teaching languages, and I also recall courses in x86 assembly, C++, and Scheme.
I have been coding since I was 10 years old. I have a CS degree and have been in professional IT for like 30 years. Started as a developer but I’m primarily hardware and architecture now. I have never ever said I was a computer scientist. That just sounds weird.