• Windows Latest discovered Discord and other Chromium- and Electron-based applications with high RAM usage
  • RAM usage spikes from 1 GB to 4 GB on Discord, both in and out of voice chat
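
A quick way to check the headline claim on your own machine is to sample a process’s resident set size (RSS) over time. A minimal sketch, assuming Linux and Node.js; pass the PID you want to watch. Note that Discord, like any Electron app, runs several processes, so a single PID understates the total footprint:

```typescript
// watch-rss.ts - print a process's resident memory every 5 seconds.
// Usage: npx tsx watch-rss.ts <pid>
import { readFileSync } from "node:fs";

const pid = Number(process.argv[2]);

setInterval(() => {
  // /proc/<pid>/status carries a "VmRSS: <n> kB" line on Linux.
  const status = readFileSync(`/proc/${pid}/status`, "utf8");
  const kib = Number(status.match(/^VmRSS:\s+(\d+) kB/m)?.[1] ?? 0);
  console.log(`${new Date().toISOString()}  ${(kib / 1024).toFixed(0)} MiB`);
}, 5000);
```
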
  • UnderpantsWeevil@lemmy.world · 33 points · 7 days ago

    I remember how the combination of mass internet distribution of files and the blossoming gray market for file-sharing applications really super-charged file-compression technology.

    I wonder if we’ll see skyrocketing RAM prices put the same economic pressure on the system bloat rampant in modern OSes.

      • UnderpantsWeevil@lemmy.world · 15 points · 7 days ago

        I mean, YMMV. The historical flood of cheap memory has changed developer practices. We used to code around keeping the bulk of our data on the hard drive, using RAM only for active calculations. We even used to lean on “virtual memory” on the disk, caching calculations and scrubbing them over and over again, to simulate more memory than we had in the sticks. SSDs changed that math considerably: we got a bunch of very high-efficiency disk space at a significant markup, built on the same solid-state technology as our RAM. So there was a point at which one might have nearly as much RAM as storage (a friend had 1 GB of RAM in a device that only had a 2 GB hard drive). The incentives were totally flipped.

        I would argue that low-cost, high-efficiency RAM induced the system bloat, as applications could run quickly even while using only a fraction of available system memory. Meanwhile, applications that were RAM hogs felt fast compared to applications that constantly had to read off the disk.

        Internet applications added to the incentive to bloat RAM, as you could cram an entire application onto a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, much of the data was pre-processed on the business side, and you were just serving the results to an HTML/JavaScript GUI in the browser.

        Now it seems like tech companies are trying to turn the entire computer interface into a dumb terminal for the remote data center. Our migration to phones and tablets and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer-facing dumb terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.

        But TL;DR: I’d be more inclined to blame “bloat” on web browsers and low-cost memory post-’00s than on AI-written code.

        • nosuchanon@lemmy.world · 2 points · 6 days ago

          I wouldn’t be surprised if someone finally makes consumer-facing dumb terminals a thing again

          It is definitely coming, and fast. This was always Microsoft’s plan: an internet-only Windows/Office platform. OneDrive and Microsoft 365 are basically that implementation, now that we have widespread high-speed internet.

          And with the number of SaaS apps out there, the only things you need on a local machine are some configuration files and maybe a downloads folder.

          Look at the new Nintendo Switch cartridges as an example: they don’t contain the game, just a license key. The install is all done over the internet.

  • xthexder@l.sw0.com · 31 up / 2 down · 7 days ago

    Windows Latest discovered Discord and other Chromium and Electron-based applications with high RAM usage

    Lol, this is news? Where have they been the last 15 years?

    In other news, the sky is blue.

    • BlueMagma@sh.itjust.works · 5 points · 5 days ago

      I’m using Linux on all my PCs, and the RAM problems exist here too. Firefox takes the most, and the Slack app takes a big chunk as well. Linux is not exempt from badly written code; it’s everywhere, and nobody seems to care about optimizing their code’s memory usage anymore.
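
      For finding the hogs on your own Linux box, a minimal sketch (assumes Node.js and the procps ps utility; RSS double-counts pages shared between processes, so multi-process Electron apps look even bigger than they really are):

      ```typescript
      // top-ram.ts - list the ten biggest RAM consumers by resident set size.
      import { execSync } from "node:child_process";

      const out = execSync("ps -eo rss,comm --sort=-rss", { encoding: "utf8" });

      // Skip the header row, keep the top ten processes.
      for (const row of out.trim().split("\n").slice(1, 11)) {
        const m = row.trim().match(/^(\d+)\s+(.+)$/);
        if (!m) continue;
        const [, kib, command] = m;
        console.log(`${(Number(kib) / 1024).toFixed(0).padStart(6)} MiB  ${command}`);
      }
      ```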

      • Petter1@discuss.tchncs.de · 1 point · 1 day ago

        Sometimes it’s even worse: some apps get native clients on macOS/Windows, but for Linux the web app packaged as Electron is “good enough”.

      • Soup@lemmy.world · 1 point · 3 days ago

        Hardware keeps getting more powerful, but programs aren’t really doing anything more for us. Games look pretty much the same at anything but the highest settings, and a browser does the same shit it did ten years ago; with all this hardware, the developers stopped giving a shit. “Who cares if it’s good for a 2070, a handful of 5090s exist!”

      • baconsunday@lemmy.zip · 1 point · 5 days ago

        I would be interested in seeing how you have it set up that Firefox or Linux is using any substantial amount of RAM. That wouldn’t have anything to do with “badly written code”.

      • baconsunday@lemmy.zip · 6 up / 1 down · 6 days ago

        Correct! The difference is the OS.

        Windows is a RAM hog, using 4 GB or more just to exist. Linux uses 1-2 GB, sometimes less.

        Microsoft FORCES Electron web components.

        Linux has choice.

        So yes, Linux has Electron as well, but Linux is a lot lighter and nowhere near the hog Windows is.

    • Psythik@lemmy.world · 2 points · 5 days ago

      I will, once Nvidia gets off their asses and properly implements support for the Nvidia App on Linux. I’ve tried the alternative control panels for Nvidia GPUs. They suck.

      • baconsunday@lemmy.zip · 2 points · 5 days ago

        Don’t remind me, I am high on copium right now. I can’t even play any of my Steam games, but I have my Miyoo Mini+, so I’m surviving haha

  • BlueBockser@programming.dev · 27 up / 1 down · 7 days ago

    Yeah, the RAM shortage is definitely to blame on Electron. Won’t someone please think of the poor AI companies who have to give an arm and a leg to get a single stick of RAM!

    • floofloof@lemmy.ca · 6 points · 6 days ago

      I wouldn’t mind so much if they were giving their own arms and legs, but they seem to be giving ours.

    • HugeNerd@lemmy.ca · 4 points · 6 days ago

      If you have a better way of generating videos of absurdly obese Olympic divers doing the bomb from a crane, I’d love to hear it.

  • kalpol@lemmy.ca · 21 points · 7 days ago

    And here I am, resurrecting Dell laptops from 2010 with 1.5 GB of DDR RAM and Debian.

    • mcv@lemmy.zip · 10 points · 7 days ago

      I remember when they changed the backronym for Emacs from “Eight Megabytes And Constantly Swapping” to Eighty. Megabytes. Or when a Netscape developer was proud to overtake that memory use.

      What’s the point of more RAM and faster processors if we just make applications that much less efficient?

      • The Quuuuuill@slrpnk.net · 9 points · 7 days ago

        “unused ram is wasted ram”

        yeah yeah yeah, great. but all you motherfuckers did that and i’m fucking out of ram.

        • pftbest@sh.itjust.works · 3 points · 6 days ago

          This phrase is just plain wrong. “Unused” RAM is used by the kernel for the page cache. You always want some RAM free, because the whole system crawls without room for a page cache; a larger page cache lets the kernel keep more of the filesystem in memory.
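
          The distinction is visible directly in /proc/meminfo: MemFree is what nothing touches at all, while MemAvailable also counts cache the kernel can reclaim. A minimal sketch (Linux, Node.js):

          ```typescript
          // meminfo.ts - show why "free" and "available" RAM are different numbers.
          import { readFileSync } from "node:fs";

          const meminfo = new Map(
            readFileSync("/proc/meminfo", "utf8")
              .split("\n")
              .filter((line) => line.includes(":"))
              .map((line): [string, number] => {
                const [key, rest] = line.split(":");
                return [key.trim(), parseInt(rest, 10)]; // values are in kiB
              }),
          );

          // Cached pages count as "used" but are reclaimed on demand,
          // which is why MemAvailable is the number that actually matters.
          for (const key of ["MemTotal", "MemFree", "MemAvailable", "Cached"]) {
            const gib = (meminfo.get(key) ?? 0) / 1048576;
            console.log(`${key.padEnd(12)} ${gib.toFixed(1)} GiB`);
          }
          ```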

        • Korhaka@sopuli.xyz · 3 points · 7 days ago

          I want to run more than 1 process, thanks. So fuck off with you trying to eat 3 GB to render a bit of text.

    • The Quuuuuill@slrpnk.net · 4 points · 6 days ago

      what’s google got to do with it? this is an article about a product developed at GitHub (now a microsoft subsidiary) causing problems with Windows, and the thumbnail is showing products from the following companies:

      • facebook
      • discord
      • microsoft
      • microsoft
      • microsoft
      • microsoft

      like. look. i hate google. they partner with israel to conduct genocide (don’t use waze, btw, or better yet, don’t use any google products). but this seems like not looking at the whole of how evil all of big tech is just to focus on how evil one company in big tech is

      • rdri@lemmy.world · 3 points · 6 days ago

        The article mentions Chrome/Chromium: 9 times
        The article mentions Google: 0 times

        Google made Chrome. Chrome had that multi-process architecture at its core, which allowed it to consume as much memory as needed, even on a 32-bit OS. Chromium was always inside it, and open source. Then they created CEF, which let webdevs build “real” apps, and that opened the floodgates. Electron was first built on CEF, but its developers wanted to include Node and couldn’t, because that required too much experience in actual coding, so they switched to Chromium itself. It didn’t change much structurally; it basically just invited more webdevs to build more “real” apps (at its 1.0 release, Electron advertised hundreds of apps built with it on its website).
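
        For scale, a minimal sketch of an Electron main process; the electron APIs are real, the app itself is hypothetical. This is roughly all it takes to ship a “real” desktop app on that stack:

        ```typescript
        // main.ts - the whole "native app": one Chromium window plus Node.
        import { app, BrowserWindow } from "electron";

        app.whenReady().then(() => {
          const win = new BrowserWindow({ width: 1024, height: 768 });
          // Each BrowserWindow gets its own Chromium renderer process,
          // which is where the hundreds of megabytes per app come from.
          win.loadURL("https://example.com");
        });

        // Quit once every window is closed.
        app.on("window-all-closed", () => app.quit());
        ```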

        Google could have done something about how the web engine works in frameworks (which don’t need that much actual web functionality), but didn’t. They invited webdevs to do anything they wanted. Webdevs didn’t care about security, because mighty Google would just publish a new Chromium update eventually. They never realized that their local “real” app GUIs, which just connect to their own websites, don’t need that much security, because there is little room for danger in such scenarios. They just kept updating the underlying engine, because why not. The Chromium DLL is now at 300 MB or something? All of that code is much needed by everyone, is it not?

        So, for me the sequence was always this:

        Google (caring about webdevs, not the OS) ->
        Webdevs (not caring about native code, wanting to sell their startup websites by shipping apps) ->
        Reckless web development becoming the norm for desktop apps ->
        Corporations seeing no problem with any of the above (e.g. Microsoft embedding more stuff with WebView2, aka Chromium)

        So yes, Google has everything to do with it because it provided all the bad instruments to all the wrong people.

        Personally, I don’t care much about hating Microsoft anymore because its products are dead to me and I can only see my future PCs using Linux.

      • Turret3857@infosec.pub · 4 points · 6 days ago

        CoMaps is a good alternative to Waze. If you think it isn’t, make an OSM account and help make it one :p

  • Lightsong@lemmy.world · 6 points · 7 days ago

    I have a couple of old 8 GB sticks from my old GTX 960 PC. Is there any way for me to stick them into my new PC and have only certain apps use them and nothing else?

    • towerful@programming.dev · 5 points · 7 days ago

      Only for multi-CPU mobos (and that would be pinning a thread to a CPU/core with NUMA enabled, so a task accesses its local RAM instead of all system RAM; see the sketch below). Even then, I think all the RAM would run at the lowest frequency.
      I’ve never mixed CPU and RAM speeds; I’ve only ever worked on systems with matching CPUs and RAM modules.

      I think the hardware cost and software complexity to achieve this are beyond the cost of “more RAM” or “faster storage (for faster swap)”.
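
      For reference, the NUMA pinning mentioned above looks roughly like this. A sketch assuming Linux with the numactl utility installed; “./memory-hungry-app” is a hypothetical binary. It restricts which node’s memory a process allocates from, but it cannot dedicate specific DIMMs to one app:

      ```typescript
      // pin-to-node.ts - run a program with CPU and memory bound to NUMA node 0.
      import { spawn } from "node:child_process";

      const child = spawn(
        "numactl",
        ["--cpunodebind=0", "--membind=0", "./memory-hungry-app"], // hypothetical binary
        { stdio: "inherit" },
      );

      child.on("exit", (code) => console.log(`exited with code ${code}`));
      ```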

    • Logical@lemmy.world · 3 points · 7 days ago

      As to whether it’s possible to get certain apps to use specific physical RAM sticks, I’m not sure, but that seems unlikely and would probably require some very low-level modifications to your operating system. But even before you get to that point, you’d have to physically connect them to your new motherboard, which will only work if it has free RAM slots and takes the same generation of RAM your old PC uses.

  • DannyMac@sh.itjust.works · 2 points · 7 days ago

    I’m tired of this! How can we start our own RAM foundry–is that the right term? Surely there’s a YT tutorial somewhere.

    • UnderpantsWeevil@lemmy.world · 6 points · 7 days ago

      The latest semiconductor manufacturer specializing in RAM is ChangXin Memory Technologies

      As of 2019, CXMT had over 3,000 employees, and runs a fab with a 65,000 square meters clean room space. Over 70% of its employees are engineers working on various research and development related projects. CXMT uses its 10G1 process technology (aka 19 nm) to make 4 Gb (gigabit) and 8 Gb DDR4 memory chips. It has licensed intellectual property originally created by Qimonda.

      So… whatever that costs. Although, I think this wiki is a bit behind the times, as they’ve got DDR5-8000 memory in flight according to TechInsights.

      • towerful@programming.dev · 4 points · 7 days ago

        It must take so much R&D to achieve anything remotely comparable to what Samsung, Micron (/Crucial… RIP) and SK Hynix can produce.

        Fingers crossed they can undercut the 3 (now 2) big producers, though that’s doubtful. But hopefully they can at least cap how far the price of decent memory can inflate, because at some point a medium-sized customer is going to get fed up with the Samsung/Micron/SK Hynix bullshit and custom-order the RAM they need, and such a smaller producer will provide much better service for a similar price.

        • UnderpantsWeevil@lemmy.world · 4 points · 7 days ago

          The miracle of the Chinese Economy (and, really, all the BRICS countries) has been their willingness to educate and industrialize their population.

          Yeah, it takes a ton of R&D, but when you’ve got 1.4B people you’re going to sift out a few who can get the job done. India’s Tata is already building their own semiconductor facilities. Brazil’s semiconductor sector has been struggling to break into the global market for… decades. Russia’s so sanctioned that they’ve got no choice but to go in-house. South Africa is finally building industrial facilities to match their role in the raw materials supply chain.

          I would suspect this crunch in the global market is going to incentivize a ton of international investment in manufacturing entirely to meet domestic demand. And heaven help us all if there’s an actual flashpoint in the Pacific Rim, because that’ll shut down the transit that companies like TSMC and Broadcom need to produce at current scales.

          I just wouldn’t hold my breath, especially under the current protectionist political environment. You’re not going to be buying outside of the US sphere of influence any time soon.

            • UnderpantsWeevil@lemmy.world · 1 up / 1 down · 6 days ago

              Apple’s willingness to offload all of their production there and basically revolutionize their Tech industry

              Taiwan’s Foxconn building assembly plants in Shenzhen in 2005 does not explain why Huawei is releasing cutting-edge phones in 2025.

              Besides, if you want to get historical, Apple cribbed all their technology from Microsoft’s trash bin back in the 90s. And Microsoft plundered IBM and the early tech companies of the 1980s before that.

              Chinese firms didn’t cheat by licensing the same technology every American firm was outright stealing through reverse engineering.

          • towerful@programming.dev · 1 point · 7 days ago

            Pretty sure all RAM manufacturers are Korean? I guess China puts chips on PCBs, maybe? But South Korea has the knowledge, and it had met domestic demand: RAM prices have been acceptable for many, many years.
            It’s the AI sector that is inflating demand (maybe through circular investment and contracts).
            So I don’t see anyone investing 10 years into the future to make DDR6 RAM when their business plan relies on current trends.

            • UnderpantsWeevil@lemmy.world · 2 points · 7 days ago

              Pretty sure all RAM manufacturers are Korean?

              Micron is American, headquartered in Boise, Idaho. Western Digital is based in San Jose, California. Kioxia (formerly part of Toshiba) is Japanese.

              Only Samsung and SK Hynix are Korean.

              So I don’t see anyone investing 10 years into the future to make DDR6 RAM when their business plan relies on current trends.

              Even if you’re not up to DDR6, there’s money to be made in lower-tier memory for lower-quality devices. Also, when the market is in a pinch, you’ll be able to scale up with investment dollars faster if you’re already in the business.