#DearMastomind: What is it that actually makes a system slow under heavy Web use?
Is it just memory exhaustion? Disk swapping? CPU / cores?
(Omitting any consideration of network / bandwidth here. Just looking at slow-as-fuck desktop/laptop systems.)
This with an eye to speccing out some new kit. Guides to what to look for / avoid / what's unnecessary expense would be handy.
(Main driver is an iMac 17,1: Intel Core i5, 8 GB RAM, and a Fusion drive, which I manage to pig out routinely.)
Thinking 16--32 GB RAM may be a minimum. Principally driving Firefox on Linux. Known high-water mark is 1750+ tabs. Yes, I know I have a problem, thank you for caring.
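For sizing the new box, it helps to measure what the current Firefox session actually occupies. A rough sketch for Linux (assumes procps-style `ps` and processes named "firefox"; adjust the name for your distro or build):

```shell
# Sum the resident set size (RSS) of every Firefox process, in GiB.
# Assumes procps ps; use "firefox-esr" or similar with -C if your
# distro names the binary differently.
ps -C firefox -o rss= | awk '{sum += $1} END {printf "%.1f GiB\n", sum/1048576}'
```

Note RSS double-counts pages shared across Firefox's many content processes, so treat it as a rough upper figure; `smem -P firefox` reports fairer PSS numbers where available.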
I've written Chrome AND Chromium, as well as anything based on them, out of my life.
Other loads are typically far smaller, though there may be some compiles, occasional large datasets (postgresql, sqlite, R, Python), and document compiles (LaTeX, pandoc), light audio/image edits. Mostly I live in bash / vim / mutt if at all possible.
#ComputerHardware #SystemPerformance #Firefox #Webbrowsing
@bthylafh The problem with leaks is that feeding them merely fuels the fire. They'll eat all you give 'em.
I've been a strong fan of limiting browser resources per tab (or per site) to something exceedingly limited.
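I've not found a good per-tab knob, but on Linux the whole browser can at least be boxed into a cgroup. A sketch for a systemd-based desktop (the 4G ceiling is an arbitrary example, not a recommendation):

```shell
# Run Firefox inside a transient cgroup scope with a hard memory ceiling.
# MemoryMax= and MemorySwapMax= are systemd resource-control properties
# (cgroup v2); when the cap is hit, the OOM killer acts inside the scope
# instead of the whole desktop grinding into swap.
systemd-run --user --scope -p MemoryMax=4G -p MemorySwapMax=0 firefox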
As it is, I limit JS sharply, generally through uMatrix with scripts disabled by default. Of course, it's hard to ascribe specific performance issues to specific sites.
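For reference, that policy boils down to something like this in uMatrix's "My rules" pane (syntax is `source destination type action`; example.com is a placeholder for whatever sites you actually trust):

```
* * script block
example.com example.com script allow
```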
Why a site must remain fully in memory and consuming CPU cycles when not focused is ... a puzzler. There are a very few application sites that really must do this, but generally, no.
(Mastodon might be one of those.)
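On that note, Firefox can at least be asked to unload background tabs under memory pressure. A user.js sketch (the pref name assumes a recentish Firefox; verify it in about:config, and see about:unloads for what's eligible):

```js
// Ask Firefox to unload background tabs when memory runs low.
// Pref name assumes a recentish Firefox; check about:config first.
user_pref("browser.tabs.unloadOnLowMemory", true);
```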
I'm also somewhat inclined to abandon Web (or Electron) apps in favour of native ones where possible, and the graphical Web in favour of text (I do use w3m heavily, where possible).