We asked professionals if they wanted Apple’s desktop, and they all said no.

  • shanghaibebop@beehaw.org · 2 years ago

    Disagree.

    All the software companies I work with have switched to MacBook Pros as their mainline professional laptop of choice in the past decade.

    It’s simply a better product for most developer work, and much easier to support.

    • TheTrueLinuxDev@beehaw.org · 2 years ago

      Well, I’ve worked for the government (as a contractor), for corporations, and for small businesses, and I could count on one hand the times I’ve seen people using Apple Mac Pro machines (MacBook Pros show up more often, though very rarely for development), versus more times than I can count seeing Linux or Windows workstations.

      We use Linux desktops often because most of our servers run Linux, so it helps to keep versions consistent with what the servers are running. We occasionally use Windows for Visual Studio, proprietary software, and so forth, and there are even a few cases where we get discounts for buying the Linux version of software rather than the Windows one.

      Employees in my office switched from Apple MacBook Pros to Windows/Linux laptops like the Framework Laptop, because the MacBook Pro often lacks the kind of GPU you would find in a Linux or Windows workstation. Apple has gone off into its own little world with its Metal API and in-house GPUs, and that doesn’t reflect where emerging technologies are actually heading. For instance, for some of our computational workloads we use Vulkan Compute so that we can buy both Nvidia and AMD GPUs to generate real-time data; had we tied ourselves to the Metal API and Apple’s hardware, it would have been cheaper to buy cloud compute servers instead. (We wanted to ensure each developer could test the Vulkan code on their own desktop/workstation; see the rough sketch below.)
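
      This isn’t the commenter’s actual code, but as a rough illustration of the cross-vendor point, here is a minimal C sketch (assuming a working Vulkan loader and headers are installed) that enumerates whatever GPUs are present and checks each one for a compute-capable queue family. The same binary works against Nvidia and AMD drivers alike, which is the portability Metal doesn’t offer.

      ```c
      // Minimal Vulkan probe: list GPUs and report whether each exposes a
      // compute-capable queue family. Build with: cc probe.c -lvulkan
      #include <stdio.h>
      #include <stdlib.h>
      #include <vulkan/vulkan.h>

      int main(void) {
          VkApplicationInfo app = {
              .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
              .pApplicationName = "compute-probe",   // illustrative name
              .apiVersion = VK_API_VERSION_1_1,
          };
          VkInstanceCreateInfo ici = {
              .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
              .pApplicationInfo = &app,
          };
          VkInstance instance;
          if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS) {
              fprintf(stderr, "No Vulkan loader/driver available\n");
              return 1;
          }

          // Enumerate all physical devices, regardless of vendor.
          uint32_t count = 0;
          vkEnumeratePhysicalDevices(instance, &count, NULL);
          VkPhysicalDevice *devices = malloc(count * sizeof(*devices));
          vkEnumeratePhysicalDevices(instance, &count, devices);

          for (uint32_t i = 0; i < count; ++i) {
              VkPhysicalDeviceProperties props;
              vkGetPhysicalDeviceProperties(devices[i], &props);

              // Look for a queue family that supports compute dispatches.
              uint32_t qcount = 0;
              vkGetPhysicalDeviceQueueFamilyProperties(devices[i], &qcount, NULL);
              VkQueueFamilyProperties *qprops = malloc(qcount * sizeof(*qprops));
              vkGetPhysicalDeviceQueueFamilyProperties(devices[i], &qcount, qprops);

              int has_compute = 0;
              for (uint32_t q = 0; q < qcount; ++q)
                  if (qprops[q].queueFlags & VK_QUEUE_COMPUTE_BIT)
                      has_compute = 1;

              printf("%s: compute queue %s\n", props.deviceName,
                     has_compute ? "available" : "not found");
              free(qprops);
          }

          free(devices);
          vkDestroyInstance(instance, NULL);
          return 0;
      }
      ```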

      • shanghaibebop@beehaw.org · 2 years ago

        My experience has been that all GPU-intensive workflows have been pushed to the cloud. It works a lot better for CI/CD purposes as well, and most of the larger datasets are simply too large for a laptop in practice; it ends up being prohibitively slow to pull datasets from your databases down to your own machine and then train locally.

        I could be biased since most of my network is in the SV startup scene, where hardware cost is generally the LAST thing most companies worry about. I haven’t seen a non-Mac software company that isn’t a 5,000+ person dinosaur.

      • MicholasMouse@beehaw.org · 1 year ago

        I cannot think of a single person at any of the places I’ve worked in the last 7 years, small or large, who has used Windows, with a single exception. That exception was me, because I needed to use Visual Studio, and I was miserable the whole time. Now that I’ve swapped to an MBP, you couldn’t give me enough of a raise to get me to go back. Pretty much everyone I know in DevOps, SRE, IT, and development avoids interacting with Windows unless it is physically impossible. I don’t think I’m an exception either. My entire college education was done on Macs (not that I chose to use a Mac; every computer in the comp sci building was a Mac). Everyone used Macs while I was interning at Goodyear. Everyone used Macs when I was doing ML research in academia. All of my friends who have stayed in the industry use Macs, regardless of their role. Honestly, it would surprise me to hear that someone didn’t use a Mac, unless they took a laptop and installed a flavor of Linux as the main OS.