

Waiting for Canonical to upsell proprietary features for these utils by subscription. Ubuntu’s regular release cycle was brilliant in 2004, when there weren’t a lot of alternatives, but why does it still exist?
It’s a bad combo in my opinion. The HDMI Forum hates Linux, so on Linux we mostly use DisplayPort. If you need HDMI 2.1 or higher for 8K, I don’t know if it will work; it might end up with a really low frame rate. It is also a crazy low-end graphics card for 8K. That’s a low-end 1080p card as far as games go. DRM is a problem with crappy companies like Netflix, so you will probably be watching upscaled standard-def pictures. They must want us to pirate.
Spam, spam, spam
It isn’t relevant to the Linux kernel at all. Even though Torvalds wrote git to support Linux development, the kernel developers operate on a different development model (email, patch sets, etc.). It is very relevant to the wider ecosystem (Linux distros vs the Linux kernel). Most open source software development is hosted on one of these platforms, and even non-developers sometimes need to interact with them. Anyone starting a project or looking to share it finds themselves asking the same questions.
I prefer this sort of engagement-farming question to the ones asking which laptop to buy or which distro or desktop environment is best, though it is arguably healthier and more productive for me to be doing almost anything else with my life. I increasingly feel like I am filling out a captcha every time I answer such a question. It feels like something any reasonably competent human could discover trivially by hitting a small number of websites and reading. Even the people who cut and pasted low-effort LLM responses pretty much nailed most of the facts, which is arguably more than good enough. What is the point of participating here, really?
In my opinion GitHub in its current incarnation mainly exists to steal the IP of programmers and lock it up in proprietary AI services controlled by Microsoft. It dominates for the same reason Facebook or YouTube dominate: it is the only platform normies know, and it benefits from massive network effects. It is US owned and operated, which is becoming an issue for lots of people. GitHub is a proprietary, closed-source platform. I believe it was originally mostly written in Ruby, but they have likely replaced all the performance bottlenecks using other languages. In my opinion their site is a usability nightmare.
Forgejo is a fork of Gitea by Codeberg, a community-run non-profit from Germany (still a liberal democracy under the rule of law) and hosted in Europe. They provide free hosting for open source projects, or it is easy to self-host. Gitea is itself a fork of Gogs and remains active. All of those forks are written in Go, and each needs only a single binary, a config file and an SQL database to run, making them very easy to self-host even without containers.
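As a rough sketch of how small that footprint is, something like this `app.ini` (the paths, domain and SQLite choice are my assumptions, not official defaults; check the Forgejo docs) is about all the configuration the single binary needs:

```ini
; Hypothetical minimal /etc/forgejo/app.ini for a small self-hosted instance
[server]
DOMAIN    = git.example.org
HTTP_PORT = 3000
ROOT_URL  = https://git.example.org/

[database]
; SQLite keeps the whole install to one binary plus one database file
DB_TYPE = sqlite3
PATH    = /var/lib/forgejo/data/forgejo.db

[repository]
ROOT = /var/lib/forgejo/repos
```

Point the binary at it, stick it behind a reverse proxy, and that’s basically the whole deployment.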
GitLab is a service like GitHub or Codeberg that can also be self-hosted, but it is written in Ruby, a slow and inefficient interpreted language which, like JavaScript or Python, has lots of crazy fragile runtime dependencies. The open source project was originally the work of Dutch and Ukrainian programmers and it was a Dutch company, but they took VC money and IPOed, and I don’t know that I would assume it is European controlled. Some open source projects like GNOME moved there as it was the main alternative to GitHub. Can’t recommend it over Gitea/Forgejo for self-hosting.
For single developers and small groups, arguably all you really need is git and email if you don’t need or want all the extra fluff. That can work even for large projects like the Linux kernel. Sites like GitHub tend to serve as single points of contact for lots of projects: the front page, issue tracker, everything, which is one hell of a dependency on another company. It has Facebook-ized the code ecosystem. I think it also sort of serves as a LinkedIn for some people.
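As a sketch of how minimal that git-plus-email flow is (the repo names and identities here are made up, and in real use the patch file travels by email, e.g. via git send-email or as an attachment, rather than a shared directory):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# "Upstream" repo the maintainer owns (a local stand-in for a real remote).
git init -q upstream
git -C upstream -c user.email=m@example.org -c user.name=Maint \
    commit -q --allow-empty -m "initial"

# Contributor clones it, commits a change, and exports it as a mailable patch.
git clone -q upstream contrib
cd contrib
echo "hello" > README
git add README
git -c user.email=c@example.org -c user.name=Contrib commit -q -m "Add README"
git format-patch -1 -o ..        # writes ../0001-Add-README.patch

# Maintainer applies the emailed patch; authorship is preserved.
cd ../upstream
git -c user.email=m@example.org -c user.name=Maint am -q ../0001-Add-README.patch
git log --oneline                # shows both commits
```

No forge involved at any point; the patch file is the whole interchange format.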
I think the ctrl-y vs cmd-shift-z split was a Windows vs Mac thing. A lot of commercial GUI software originated on the Mac, including Photoshop (and much of Microsoft Office), and the Mac remains popular with the creative crowd. Older Linux GUI software used to be weird: either CDE/Motif stuff or things that looked like they were developed on an Amiga. Keyboard standardization was never a thing with Linux, e.g. Emacs and vi.
I believe ctrl-shift-z is standard across many GNOME and KDE apps now, all the ones I could quickly test anyway. Inkscape and GIMP kind of do their own thing, but Inkscape definitely shows ctrl-shift-z as the primary redo shortcut for me, although it seems to support ctrl-y as well. So I think GIMP is just weird as usual. Its UI doesn’t conform to the expectations of contemporary Linux users, let alone people from other platforms. I would probably just assume GIMP was broken, close it and open Krita instead.
AMD is by far the best choice for FOSS drivers. Intel might be an option in the future, but I have no experience with their new cards. A second option would be good for Linux users, but it’s unlikely to be NVIDIA.
Systemd provides a modern user space which fixes a huge number of problems. At first I found it difficult to learn, but it had things I needed and I made the effort. I will always be nostalgic about the days before systemd because I started using Linux in the mid-90s.
I’m not going to throw away my GPU and multi-core CPU and go back to a 386 running DOS because multithreaded applications and speculative execution scare me. There is no way to match what modern systems can do by taking old architectures and just adding more gates or faster clock speeds. And there are parallels in software architecture.
There is absolutely nothing wrong with running a BSD or a non-systemd Linux distro if you like. They are still perfectly usable in a lot of situations, doing the same stuff people did with these systems for years. If you have a server with a static set of devices that runs a fixed set of services at startup, you don’t really need systemd. I still have some systems like that, but systemd also handles those cases more efficiently and robustly.
You see these sort of link dumps from people who think vaccines cause autism or that some diet will cure cancer. Whatever the intention behind it I always associate it with a bad faith attempt to fuck with people’s heads by bamboozling them with more information than they can rationally analyze.
Believe what you want but you might want to consider that all the experts working on systemd and using it productively might know their shit.
I recognize many of the deficiencies of Gnome but on balance I still like using it.
I never migrated from a Windows or Mac desktop. I got into Linux before Windows 95 came out, and although I had used Mac and Amiga desktops I never owned one myself. I have used tiling WMs and plain WMs with no desktop environment, and I can find my way around Windows 11 or a Mac, but I don’t like either as much as GNOME. KDE generally has a better foundation, thanks largely to Qt, but I never enjoyed using it. Not surprised it is very popular with the new influx of Windows refugees. To me KDE always had a slightly dated Windows look and feel. It is still a very solid choice of course.
GNOME hate became fashionable when they moved on from GNOME 2, and some people never shut up about it. We get it: your favorite band aren’t teenagers anymore and decided to make an album you don’t like. It is ancient history. Just use something else, guys. Plasma is pretty damn good, so use it. What’s the point of a free OS if you can’t accept that people want the freedom to develop and enjoy different computing experiences?
What is socialist to Linus is libertarian to Eric Raymond. In huge collaborative communities people need to learn to get along and be tolerant but if they can’t handle that they also have the freedom to fork so there is room for everyone to find their own space.
For people with experience with any mobile 12th-gen Intel and the Framework 13 AMD: can you quantify what you think the upgrade is worth, or would it be better to wait for a refresh to the “AI” series, if that ever happens?
I look at the price for the board/RAM/WiFi upgrade and struggle to justify it, even though I expect the AMD CPU to be cooler and quieter and have a much better iGPU. I know it should easily outperform the Steam Deck in raw performance, so with some scaling it should be reasonable for some light casual gaming, but I don’t have any experience with AMD outside of desktop CPUs and dedicated graphics. Every time I consider an upgrade, it makes more sense to buy desktop upgrades and cope with the Intel system for a few more years. I don’t have a good use for the Intel mainboard as it doesn’t have much expansion: no multiple SSDs, PCIe, etc.
Most mobile/laptop devices should be encrypted by default. They are too prone to loss or theft. Even that isn’t sufficient with border crossings where you are probably better off wiping them or leaving them behind.
My desktop has no valuable data like crypto, sits in a locked and occupied house in a small rural community with relatively low crime (public healthcare, social security, aging population). I have no personal experience of property theft in over half a decade.
I encrypt secrets with a hardware key, and they are only accessed as needed. This is a much more appropriate solution than whole-disk encryption for my circumstances. Encrypting Linux packages and Steam libraries doesn’t offer any practical benefit, and unlocking my filesystem at login would not protect against network exfiltration, which is a more realistic risk. It adds overhead and another point of failure for no real benefit.
There is a whole world of obsolete stuff nobody will ever do with a Linux system anymore. Terminal servers with lots of serial terminals, or modems for a BBS. Making a fax server, IVR or digital answering machine for analog land lines. Using removable optical or magnetic media. Recording broadcast TV. SCSI, FireWire. It is interesting to imagine what from today will be obsolete in a few years.
I don’t want to see the terminal emulator. No chrome around the window. Solid emulation. That’s about it. I’m still using kitty and it has a good balance of the stuff I use. I don’t really get the point of Ghostty. These things are a bit like browsers: they just display the content and are interchangeable. People get super weird about terminal emulators and window managers.
I have been using Linux since the early 90s. I don’t know it all. I read man pages. I use -h or --help. I read the arch wiki. I read docs. I read source files and examples. Lots of reading. You will never know it all. There is too much information.
You need to know how to find information. It can be tricky. Knowing how to ask the right questions often requires you to know a bit of the answer.
Stumbling about trying to find answers is training the skills you need.
I think it helps if you have a programming background and IT support experience, not just because you will understand more concepts and terms but because you have already developed some of those skills. But some people come from other backgrounds and pick things up really quickly because they have well-developed research skills.
No, but there is an ideological basis for free software, though it is firmly based on practical experiences dealing with the consequences of closed-source devices.
Red Hat and Ubuntu are businesses. Debian and Arch are communities. Some of the smaller distros are basically that one guy in Nebraska.
People promote them for various reasons. An IBM employee will have different reasons to the supporter types who latch on to a distro and mascot like it was a football team. Now football, there is a religion: it’s all ritual, nothing they do has any practical use, people congregate once a week, and in some parts of the world it turns violent.
When the deb users start committing genocide on the rpm users I’ll call it a religion. Until then it’s just a bunch of anime convention fans arguing about their favourite isekai.
It will be 35 years way too soon. I can’t remember the last time I compiled a kernel let alone what exactly I was doing with a computer in the early 90s.
It’s weird that most of the world runs on Linux outside of the desktop and we still have these discussions. I didn’t know what a distro was in the beginning. It was a Linux kernel and a GNU user space someone had compiled to get people started. If the disk sets had a name, I didn’t know or care.
There is no simple answer. It is almost entirely dependent on implementation. All systems are vulnerable to things like supply chain attacks. We put a lot of trust in phone vendors, telcos and Google.
If you are going to compare with something like Termux, you need to compare against an equivalent sandboxed environment on regular Linux, like a docker/podman container with appropriate permissions. As far as I know they use the same Linux kernel features, like cgroups and namespaces, under the hood.
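You can poke at those shared kernel primitives directly on any Linux box; this just inspects the namespaces and cgroup of the current shell, which are the same handles Android sandboxes and podman/docker containers are built from:

```shell
# Each entry here is a namespace this process lives in; a sandbox is made
# by giving a process different ones of these, plus cgroup limits and
# seccomp filters.
ls /proc/self/ns
# The cgroup hierarchy confining this process:
cat /proc/self/cgroup
```

Running the same commands inside a container shows different namespace IDs and a different cgroup path, but the mechanism is identical.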
Traditionally, Linux desktop apps run with the full permissions of the user, and the X Window System lets apps spy on each other, which is less secure than Android’s sandboxing by design. There have been attempts to do better (e.g. Flatpak/Flatseal, Wayland) but they are optional.
Having comprehensive Unicode language coverage on a free OS is amazing. I wish the font system was smart enough to hide the Noto variants in creative apps but leave them available for browsers; there is a workaround to do that but it’s a huge pain. I wouldn’t delete any files managed by the package system, they will just keep coming back anyway. There are smaller collections of Noto fonts in the AUR that will satisfy the noto-fonts dependency, which should keep KDE Plasma happy. They should be a straight swap if you are comfortable with an AUR dependency for a functioning desktop. The newer one is noto-fonts-main, updated this year, or there is an older noto-fonts-lite. I have not tried either. Usual stuff about backups and taking advice from strangers on the internet.
Segoe might benefit more from the embedded-bitmap or autohint settings than the regular open source fonts I am likely to use. Microsoft would optimise the hell out of it to take advantage of their proprietary, patented font rendering system, and I wouldn’t be surprised if it rendered poorly with distro defaults. It’s the kind of blind spot a lot of open source devs and packagers could easily have. It’s probably packed full of embedded bitmaps for small sizes and proprietary hinting stuff that Linux won’t understand.
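If anyone wants to experiment, fontconfig can force those settings per family. A hedged sketch only: the family name, the file path, and whether your fontconfig/FreeType build honours these properties for this font are all assumptions worth checking with `fc-match` and `fc-list`:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- e.g. ~/.config/fontconfig/conf.d/10-segoe.conf -->
<fontconfig>
  <match target="font">
    <test name="family"><string>Segoe UI</string></test>
    <!-- Prefer the embedded bitmaps shipped for small sizes -->
    <edit name="embeddedbitmap" mode="assign"><bool>true</bool></edit>
    <!-- Don't autohint over the font's own hinting instructions -->
    <edit name="autohint" mode="assign"><bool>false</bool></edit>
  </match>
</fontconfig>
```

Toggle each setting separately and compare; whether embedded bitmaps actually look better is very much per-font and per-size.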
Our family does a reasonable amount of editing in Kdenlive every week (YouTube, education, etc.). A decade or so ago practically every video editor on Linux felt incredibly unstable; I remember trying to work in Cinelerra. Now shit just works. There are a couple of things in the workflow that still need other tools, but Kdenlive has been fairly solid. It could do with some minor usability tweaks to make it friendlier to people coming from other editors and to beginners. Also, I wish the GPU acceleration (Movit) was stable enough to be enabled in MLT in Kdenlive builds. Focusing on stability makes sense though.