• 1rre@discuss.tchncs.de · 3 months ago

    If you want to know how computers work, do electrical engineering. If you want to know how electricity works, do physics. If you want to know how physics works, do mathematics. If you want to know how mathematics works, too bad, best you can do is think about the fact it works in philosophy.

  • Warl0k3@lemmy.world · 3 months ago

    tbf all good programmers are good at math. Not classic arithmetic necessarily, but at the very least applied calculus. It’s a crime how many people use a mathematical discipline every day but don’t think they’re “good at math”, because of how laser-focused the world is on algebra, geometry and trig as being all that “math” is.

      • Warl0k3@lemmy.world · 3 months ago

        PID control is the classic example, but at a far enough abstraction any looping algorithm can be argued to be an implementation of the concepts underpinning calculus. If you’re ever doing any statistical analysis, or anything in game design having to do with motion, those are both calculus too. Data science is pure calculus, ground up and injected into your eyeballs, and any string manipulation or regex is going to be built on lambda calculus (though a very correct argument can be made that literally all computer science is built on lambda calculus, so that might be cheating to include it).
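
        The discrete version of the PID example is easy to see in code. Here’s a minimal sketch (function name and gains are invented for illustration): the integral term is just a running sum of the error and the derivative term is a finite difference, i.e. numerical calculus hiding inside a loop.

```python
def make_pid(kp, ki, kd, dt):
    """Discrete PID controller: the I term is a running sum (a Riemann
    sum of the error over time) and the D term is a finite difference,
    the discrete stand-ins for an integral and a derivative."""
    integral = 0.0
    prev_error = 0.0

    def step(error):
        nonlocal integral, prev_error
        integral += error * dt                   # ~ integral of e(t) dt
        derivative = (error - prev_error) / dt   # ~ de/dt
        prev_error = error
        return kp * error + ki * integral + kd * derivative

    return step

pid = make_pid(kp=1.0, ki=0.1, kd=0.05, dt=0.1)
print(pid(2.0))  # 1.0*2.0 + 0.1*0.2 + 0.05*20.0 = 3.02
```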

        • CanadaPlus@lemmy.sdf.org · 3 months ago

          Lambda calculus has no relation to calculus calculus, though.

          > Data science is pure calculus, ground up and injected into your eyeballs

          Lol, I like that. I mean, there are more calculus-y things, but it’s kind of unusual in that you can’t really interpret the non-calculus aspects of a neural net.
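
          To make the neural-net point concrete: the calculus is concentrated in training, where gradient descent just repeatedly steps against a derivative. A toy sketch with a made-up one-variable function (not a real net):

```python
def gradient_descent(df, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its
    derivative df -- the core calculus move behind training."""
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# Minimize f(x) = (x - 3)^2; its derivative is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges to 3.0
```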

          • Warl0k3@lemmy.world · 3 months ago

            > Lambda calculus has no relation to calculus calculus

            I wanna fight your math teachers. No, seriously: what did they tell you calculus is, if it’s got nothing in common with lambda calculus?

            • CanadaPlus@lemmy.sdf.org · 3 months ago

              Is there some connection I’ve just been missing? It’s a pretty straightforward rewriting system; it seems Newton wouldn’t have had much use for it.

              Lots of things get called “calculus”. Originally, calculus calculus was “the infinitesimal calculus”, IIRC.

              • Warl0k3@lemmy.world · 3 months ago

                I think the issue here might be the overloading of terms - lambda calculus is both a system of notation and the common name for the conceptual underpinnings of computational theory. While there is little to no similarity between the abstracted study of change over a domain and a notational system, the ideas of function composition, continuous function theory, and even computation as a concept are all closely related to basic concepts from “calculus calculus” like limit theory and integral progression.

                edit: clarity
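
                The notational-system sense of lambda calculus can be demoed in a few lines: everything, numbers included, is a function. A toy Church-numerals sketch (illustrative only):

```python
# Church numerals: the number n is "apply f n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

def add(m, n):
    # Addition composes applications: apply f n times, then m more times.
    return lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how often f gets applied."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two, three)))  # 5
```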

                • CanadaPlus@lemmy.sdf.org · 3 months ago

                  I’m pretty sure the term was coined in the interwar era, so it’s kind of interesting if people are just calling the concept of functions “lambda calculus” now. Obviously they’re much older than that.

  • rizzothesmall@sh.itjust.works · 3 months ago

    Had a graduate Dev who did not have a fucking clue about anything computer related. How tf he passed his degree I have no idea.

    Basic programming principles? No clue. Data structures? Nope.

    We were once having a discussion about the limitations of transistors and dude’s like “what’s a transistor?” ~_~#

    • underscores@lemmy.zip · 3 months ago

      I’ve met people like that too.

      It’s called cheating, lots of people do it.

      The most worthless dev I’ve met was a comp sci graduate who couldn’t hold a candle to a guy who did a dev boot camp.

      The best dev I’ve met so far didn’t have any credentials whatsoever; the second best did a 2-year associate’s.

      There’s a tie for 3rd best between an associate’s and a 4-year degree.

    • Zink@programming.dev · 3 months ago

      I was partnered with that guy for one class in grad school. We were working on a master’s degree in software engineering, and the assignment was analysis of, and changes to, an actual code base; this mofo was asking questions and/or blanking on things like what you mention. I can’t remember the specifics, but it was some basic building-block kind of stuff. Like what’s an array, or what’s a function, or how do we send another number into this function. I think the neurons storing that info got pruned to save me the frustrating memories.

      I just remember my internal emotional reaction. It was sort of “are you fucking kidding me”, but not in the sense that somebody blew off the assignment, was rude, or was wrong about some basic fact. I have ADHD, and years ago I went through some pretty bad periods with that and overall mental & physical health. I know the panic of being asked to turn in an assignment you never knew existed, or being asked about some project at work and having no idea whatsoever how to respond.

      This was none of those. This was “holy shit, this guy has never done anything, how did he even end up here?”

    • squaresinger@lemmy.world · 3 months ago

      Tbh, as a dev, knowledge of transistors is about as essential as knowledge of screws is for a car driver.

      It’s common knowledge, and in general maybe a little shameful not to know, but it’s really not in any way relevant to the task at hand.

        • squaresinger@lemmy.world · 3 months ago

          Well, computer science is not the science of computers, is it? It’s about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.

          We all know how great we IT people are at naming things ;)

          • Ajen@sh.itjust.works · 2 months ago

            It wasn’t named by IT people, though. It was named by academics. And it’s not about using computers, it’s about computing. Computer science is older than digital electronics.

            • squaresinger@lemmy.world · 2 months ago

              Mhm, and those academics weren’t IT people and had nothing to do with computers?

              Let’s fact-check that.

              Computer science as an academic course was first created by IBM at Columbia University in 1946, because IBM had made their first commercial computer two years prior and wanted people who could operate it and continue to develop it.

  • Simulation6@sopuli.xyz · 3 months ago

    Depends on the context. When my company proposes me to a client for work I am, but oddly, during my yearly performance review I’m just some schmuck who programs.

  • StrixUralensis@tarte.nuage-libre.fr · 3 months ago

    I mean, nowadays you need to be very smart and educated to google efficiently and avoid all the AI traps, misinformation, stackoverflow mods tripping, reading reddit threads on an issue with half the comments deleted because of the APIcalypse, etc… sooo you could argue that you’re somewhat of a scientist yourself.

  • QuizzaciousOtter@lemmy.dbzer0.com · 3 months ago

    I mean, I am applying various kinds of science, but I’m not actually doing any science, so I don’t think of myself as a scientist. What I do is solve problems - I’m an engineer.

    • Corbin@programming.dev · 3 months ago

      The typical holder of a four-year degree from a decent university, whether it’s in “computer science”, “datalogy”, “data science”, or “informatics”, learns about 3-5 programming languages at an introductory level and knows about programs, algorithms, data structures, and software engineering. Degrees usually require a bit of discrete maths too: sets, graphs, groups, and basic number theory. They do not necessarily know about computability theory: models & limits of computation; information theory: thresholds, tolerances, entropy, compression, machine learning; foundations for graphics, parsing, cryptography, or other essentials for the modern desktop.

      For a taste of the difference, consider English WP’s take on computability vs my recent rewrite of the esoteric-languages page, computable. Or compare WP’s page on Conway’s law to the nLab page which I wrote on Conway’s law; it’s kind of jaw-dropping that WP has the wrong quote for the law itself and gets the consequences wrong.

      • colmear@discuss.tchncs.de · 3 months ago

        I’d honestly be interested in where you’re from and how it is in other parts of the world. In my country (or at least at my university), we have to learn most of what you described during our bachelor’s. For us there is not much focus on programming languages, though, and more on concepts. If you want to learn programming, you are mostly on your own. The theories we learned are a good base, though.

        • Corbin@programming.dev · 3 months ago

          I’m most familiar with the now-defunct Oregon University System in the USA. The topics I listed off are all covered under extras that aren’t included in a standard four-year degree; some of them are taught at an honors-only level, and others are only available to graduate students. Every class in the core was either teaching a language, applying a language, or discrete maths, and the selections were industry-driven: C, Java, Python, and Haskell were all standard teaching languages, and I also recall courses in x86 assembly, C++, and Scheme.

  • billwashere@lemmy.world · 3 months ago

    I have been coding since I was 10 years old. I have a CS degree and have been in professional IT for like 30 years. Started as a developer but I’m primarily hardware and architecture now. I have never ever said I was a computer scientist. That just sounds weird.