Who would have guessed that software package formats could cause such a fuss? But that's exactly what has happened with snap. Developed by Canonical as a faster and easier way to get the latest software versions installed on Ubuntu systems, the format ended up igniting a heated debate in the wider Linux community. To the casual user, snaps are simply a way to get the software they want as quickly as possible. But for users concerned with the ideology of free and open source software, it's a dangerous step toward the kinds of proprietary "walled gardens" that may have driven them to Linux in the first place.
Perhaps the most vocal opponent of snap, and certainly the one that has attracted the most media attention, is Linux Mint. In a June 1st post on the official distribution blog, Mint founder Clement Lefebvre made it clear that the Ubuntu spin-off did not approve of the new package format and would not be including it on base installs. Further, he announced that Mint 20 would actively block users from installing the snap framework through the package manager. It can still be installed manually, but this is meant to prevent it from being pulled in as a dependency without the user's explicit consent.
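Mint implements this block with a plain APT preference pin rather than anything exotic. Reportedly, Mint 20 ships a file to roughly this effect (a sketch; the exact comment text varies):

    # /etc/apt/preferences.d/nosnap.pref
    # Refuse to let repository packages pull in snapd as a dependency.
    # A negative pin priority means APT will never install the package
    # unless the user deletes this file first.
    Package: snapd
    Pin: release a=*
    Pin-Priority: -10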
The short version of Clement's complaint is that snaps install from a proprietary Canonical-specific source. If you want to distribute snaps, you have to create an account with Canonical and host them there. While the underlying software remains open source, snap breaks with the long tradition of distribution infrastructure that is itself open and freely replicable. This arguably makes installation simpler for novice users and administration easier for Canonical, but it also takes away freedom of choice and diversity of package sources.
To understand the situation, we should take a step back and look at what snaps actually are. Simply put, they are containerized software packages that bundle the libraries a given program needs to run. The idea is that a developer can publish a single snap that will work on essentially any modern Linux system, rather than having to build distribution-specific packages. In theory this saves time and effort on the developer's part, and ensures that even users of niche distributions get access to the software they want.
Of course, there are downsides to distributing software this way. For one, a snap package will always be larger than a traditional package for the same program, since all of its dependencies must ship along with it. As many programs naturally share the same dependencies, a system with many snaps installed wastes storage space on redundant data. Though with even entry-level machines now shipping with terabyte hard drives, this is perhaps less of a concern than it would have been in years past.
Snap packages also tend to be slower to launch, in part because they are actually compressed filesystem images that have to be mounted before they can run. Some users find this detail annoying from a system-maintenance standpoint, because each and every snap package you install shows up as a mounted filesystem.
It has been proposed that a special flag be added to these mounted snaps so that everyday tools like mount or lsblk won't display them, but that clearly brings problems of its own. After all, it's helpful to know how much disk space they occupy.
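To see what that clutter looks like in practice, here is roughly what lsblk reports on a system with a handful of snaps installed (snap names, sizes, and revision numbers are illustrative):

    $ lsblk
    NAME   MAJ:MIN RM   SIZE RO TYPE MOUNTPOINT
    loop0    7:0    0  54.7M  1 loop /snap/core18/1668
    loop1    7:1    0  93.8M  1 loop /snap/chromium/1143
    loop2    7:2    0  44.2M  1 loop /snap/gtk-common-themes/1353
    sda      8:0    0   500G  0 disk
    └─sda1   8:1    0   500G  0 part /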
For example, let's look at how the snap package for a common tool compares to installing it directly:
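A rough reconstruction of that comparison, using youtube-dl as the example (the snap revision in the filename is a placeholder; the sizes match the figures discussed below):

    $ du -h /usr/local/bin/youtube-dl
    1.7M    /usr/local/bin/youtube-dl

    $ du -h /var/lib/snapd/snaps/youtube-dl_*.snap
    91M     /var/lib/snapd/snaps/youtube-dl_95.snap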
As you can see, the difference is substantial. If we download youtube-dl directly from the developer's website, the script takes up just 1.7 MB on disk. But the snap package of the same program weighs in at 91 MB. It's clear how this problem compounds as more snaps get installed.
That said, there is real demand for this kind of "universal" Linux package. Enough that there are at least two competing approaches that work on similar principles: Flatpak and AppImage.
From a system-resources standpoint, containerized packages are clearly not ideal. On the other hand, many users would be more than happy to take the hit if it meant access to the latest versions of popular programs without waiting for them to trickle into their distribution's official package repository. Users should be able to decide for themselves which way to go based on their own needs.
That's what makes Canonical's handling of the Chromium package on Ubuntu 20.04 so troubling. Let's take a closer look at what happens when you try to install it:
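The transcript looks something like this (trimmed, with exact package versions elided):

    $ sudo apt install chromium-browser
    Reading package lists... Done
    Building dependency tree
    The following NEW packages will be installed:
      chromium-browser
    ...
    Preparing to unpack .../chromium-browser_..._amd64.deb ...
    => Installing the chromium snap
    ==> Checking connectivity with the snap store
    ==> Installing the chromium snap
    chromium ... from Canonical✓ installed
    => Snap installation complete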
Though we asked the system to install the native package, what we got was the snap. The user gets no choice and no warning; anyone not paying close attention wouldn't even realize what had happened. At the risk of sounding overly dramatic, it's a subversion of user intent.
There are certainly valid reasons why Canonical would want to distribute Chromium as a snap. Instead of building versions for each supported release of Ubuntu, they can release a single snap that works on all of them. That's especially true for older Ubuntu LTS (Long Term Support) releases, which might otherwise be stuck with an outdated version of the browser because of old system libraries.
By using this "stealth" install approach for the Chromium snap, they can make sure the process is as simple and painless as possible for their users. Indeed, most will probably never even realize the substitution has taken place.
But for those who have noticed, it's a big problem. Many users abandoned proprietary operating systems precisely to escape this sort of behavior. They want to be the masters of their own computers, with no important decisions made on their behalf without so much as a warning. These are the users Clement Lefebvre had in mind when he promised that future versions of Mint would never install snap packages without the user's prior consent.
While Canonical is no stranger to walking back unpopular decisions, snap packages are almost certainly here to stay. The logistical benefits of containerized packages are simply too great when your entire business is structured around supporting multiple versions of a Linux distribution. Conversely, the users with strong feelings about snaps will inevitably be a small (if vocal) minority. Canonical designed snaps as a solution to the unique challenges of maintaining a huge, multifaceted distribution like Ubuntu, and at that, they work exactly as intended.
That said, snap packages are unlikely to be adopted by the broader Linux community while the repository backend remains proprietary. Although Canonical lets companies create "branded" versions of the Snap Store, that's a purely cosmetic change and doesn't let anyone run their own server. So even if another distribution like Mint decided to adopt the snap package format, it would have to rely on Canonical to provide the infrastructure for delivering those packages to its users. That single point of failure is sure to be a sticking point for adoption outside of Ubuntu's gates.
Snaps don't make sense for smaller, simpler applications, but they make more sense for bigger things. For example, I use the Nextcloud snap. I could of course install all the necessary parts manually and configure them and so on, but it's much easier and faster to just install the snap package and be done with it.
I just wish Canonical and developers would avoid snap packages for the things where they don't make sense.
I can't wait for the first 200 GB text editor.
It's only going to get worse. Back to mother ship Debian!
I agree
Anyway, it's a good idea. The only reason I use Ubuntu is that its installs tend to work better than Debian's. I think if some of the Ubuntu improvements made it back into Debian, everyone would be happier. I also believe that if the Mint folks had started from Debian rather than sitting yet another step down the food chain, that would have been better too.
A long time ago, Ubuntu had better support for newer hardware. Is that still the case? I installed Ubuntu on host systems and Debian on guests. Ubuntu also had better preconfigured graphics, which also suits host systems. Fortunately, since I use Ubuntu Server as a desktop (with i3-wm) and never installed snap, my Chromium isn't snap-packaged, though running Chromium in a virtual machine over SPICE isn't all that bad either.
Neil: I'm not exactly a distribution expert, but Ubuntu tends to be much more cutting-edge and up-to-date than Debian, which takes more measured steps to keep everything stable.
That said, I'd say you can run Debian quite easily if you want. Worst case, backport a few updated packages to support your shiny new hardware as needed, and expect everything else to just work. I've certainly never had problems with Debian on that front, though I'd say the trend you mention is still there.
Debian is an impressive distribution. Even with systemd, it's light enough to run on my 2006 NAS with less than 200 MB of RAM. It also supports enough new technology that I can (barely) run SteamVR on my main PC, something I would never have expected from Linux even five years ago. Not that Debian did all the work; Valve has put a lot of effort into Linux support. But Debian doesn't deserve its reputation as an obsolete dinosaur.
Meh. For my part, I'm happy with Ubuntu and I'll stick with it.
I'm happy with Linux Mint, and I'm glad you're happy too.
Yes, I tried to get back to the mother ship: XFCE on Buster. First I had to go to really inconvenient lengths to get an Ethernet connection, because my WiFi drivers are non-free. Then 'sudo modprobe' didn't work when I tried to load the drivers. It took me a bit of research to discover that the superuser's paths aren't defined correctly in this incarnation of Buster and that plain 'su' didn't cut it, so I had to do 'sudo bash' and run it from that prompt. Still no WiFi: in the end, I found an unofficial ISO that included the WiFi drivers, did a complete reinstall, and then WiFi worked. But my TrackPoint is still unusable; it was fine during the graphical installation, but under the full operating system it's as jittery as a meth head, and none of the settings I've tried so far help. Hibernation is enabled by default, but doesn't work correctly. Power management doesn't work properly either: when my laptop is on and I close the lid, it's supposed to keep running, but it suspends instead, and that doesn't work for me.
I loved Debian back when it was at Squeeze and Wheezy. The upgrade to Jessie broke a lot of things that I couldn't get working again, so I jumped to Xubuntu. Since I'm overdue for an upgrade and Canonical is pulling the snap stunt described in the article, I'm giving 'the mother ship' another look. So far I'm not impressed, and I'll probably end up on Mint. That means a full reinstall when Debian promotes a new candidate to Stable, but at least in the meantime I can hope that everything will actually work.
Back to the Debian mother ship!
Not while they're [systemd]oomed.
There's been an alternative for a while now... real Debian = Devuan... much less infiltrated by Ubuntu-isms and GNOME-isms.
But Debian is so old (package-wise) that it's simply unusable. I'd move to Arch before using Debian as my main OS.
Maybe you, Emacs? :D
They make sense even for small programs if those programs have awkward dependencies, such as 32-bit libraries or Python 2.7.
Ideally, those programs would be refactored to eliminate those dependencies; unfortunately, it's a sad fact that some programs are no longer maintained, and the users who depend on them will have to either migrate away or take on maintenance... to squeeze a little more life out of them.
There are arguments for something like snap; however, for a bit of extra convenience, the sacrifices in disk space, performance, consistency, transparency, and the use of system-wide libraries by packages seem like a lot.
Or could it be that they just DON'T update libraries, because now they can get away with it?
That's exactly what it is. I've been hooked on SageMath for a long time, a Frankenstein of other software that I found nearly impossible to set up manually so everything works in harmony; even the binary tarballs are dependency nightmares.
Nowadays I use it through Docker, and I wouldn't mind if it were Snap instead, as long as it worked as well.
Personally, my Nextcloud is set up manually, but the point is that here we have a choice.
Canonical doesn't owe you a Chromium binary package, and if you want one, they'll give you a snap.
Snap is a security disaster. No thanks.
Honestly, this reminds me of the nightmare that is systemd. I came to Linux because I wanted power and control, not hand-holding. When I want to be coddled, I pull out my Windows or Mac box; they work and do their job. But when I want exact control, I use Linux. I hate these efforts to make Linux 'friendly and hassle-free' at the expense of the control I came for.
I just want to know that when I do X, X will happen, without side effects A through W. To me, this article is just another point for Mint and another kick at Ubuntu. Ubuntu has always seemed strange to me: it keeps the Linux name while trying in every other way to be a Mac. I don't want a Mac, I want a Linux machine. But I suppose that's the joy of having many distributions.
Snap with apt install glued on top. Does the install use snap because there's a hidden redirect to snap inside apt, or because the Chromium package in the Ubuntu repository was replaced by a stub that just pulls in the snap? In other words, who is responsible for the substitution?
They addressed this in a recent episode of the Ubuntu Podcast: switching Chromium to a snap massively simplified packaging across all supported versions of Ubuntu, which freed the developers to devote time to other things.
You sense that as things slowly move toward a snap-only solution, with Spotify doing the same thing, you become increasingly dependent on Canonical's sole discretion. With apt, it's incredibly simple for me to set up a local repository, point my machines at it, have exactly the software versions I need from that repository, and everything works. With snap, you can only use the Snap Store. And if in five years, when 80% of the software can only be installed via snap, Canonical decides that maintaining all those snap servers is expensive and it will now be $5 per month to connect to the store, what exactly is your alternative? How exactly will you install software then? Everyone will have switched to the convenient approach, and the snap backend isn't even remotely open source.
Red Hat has done something similar, but at least they have the decency to let CentOS coexist. This move is an effort not only to mimic that money-grabbing tactic, but to be far more hostile by shutting out all the other distributions. Of course, it frees up time for the developers, who need more time to create cool graphics for the store.
I was only relaying the justification they put forward, not arguing for or against it.
Canonical owes you nothing (well, apart from its GPL obligations).
You can create your own repository and provide the binary packages you want for the platforms you choose.
Yes, I'd prefer a 'Hey! We couldn't find a binary package, shall we install it via snap?' prompt, but it's not a deciding factor.
The problem isn't with packages you can build yourself; the problem is with closed-source packages that you cannot rebuild, yet will only be available as 'snap only'.
Ubuntu also used to let packages it didn't care about go stale for years. The Krita package appears to be mostly up to date now (4.2.9, the current release being 4.3.0), but over the last decade the package was out of date by several years at times. If you wanted current software, you had to chase down dependencies and build it yourself. Krita started shipping an AppImage for x64 partly to cut down the time spent helping people build it.
On another note, Ubuntu has been a favorite for years in cross-platform development. I've run it on several different ARM and MIPS platforms. I've seen it work on MicroBlaze. Will snaps simplify that world? It seems like snaps would break it.
It reminds me of the nightmare that is systemd
I couldn't agree more; that's exactly why I still use Gentoo. I can do whatever I want because there's no hard dependency on systemd (the Lotus Notes of init systems), and that makes me happy. I don't like being told what I can and can't do with MY computer, or that I supposedly want software that I hate.
I can see how it's useful; some packages were terrible to get working, and there it would have been a godsend. But subverting my chosen installation method is not good.
The simple solution is 'apt purge snapd'; I make sure to do that on my Mint.
Personally, I won't use many snaps anyway. I love the package system as it is now, and I'd rather go through the hassle of compiling from source than use snaps (you gain a good understanding of the dependencies, which is very useful when something doesn't work perfectly).
I can see that they have a purpose, and I expect they mostly avoid trouble around dependencies, since everything you need ships along with version control (though I bet there will be trouble here and there with odd configurations, ALSA for example). That makes them ideal for people coming from walled gardens, since it's a familiar and simple experience that, in theory, just works. This kind of thing, making Linux easier to use for non-technical people, always seems to suffer community backlash... which is stupid: if more people use GNU/Linux, there will be more money to pay developers, so Linux gets better for everyone! It's not as if the hugely configurable, customizable nature of Linux is going away; if you really hate it, don't use it!
As long as Ubuntu keeps playing well with both apt and snaps, and stays out of the way of the other well-supported distributions, I'm quite happy for them to exist. Even if Canonical stops packaging much as debs, I have no doubt the community will keep doing so, and of course the Debian baseline will still be available.
I admit that lately I'm using Ubuntu (and Mint/Fedora/SUSE, probably some others too), and this has given me pause. And I don't like that it isn't transparent that you're getting a snap through apt. That's cheeky behavior and not fitting for a Linux distribution!
"This kind of thing, making Linux easier to use for non-technical people, always seems to suffer community backlash... which is stupid: if more people use GNU/Linux, there will be more money to pay developers, so Linux gets better for everyone!"
More money?
https://staltz.com/software-below-the-poverty-line.html
New to the "Patreon economy"? The vast majority of projects on Patreon are hobbyists begging; the projects are an excuse to collect donations for a little extra beer money, with little commitment to anything. It's about monetizing your hobby.
I was thinking more along these lines: as Linux becomes familiar and common, companies and the like might stop paying Microsoft/IBM etc. by default.
If more people want to run Linux desktops, game development houses and publishers will invest more in the platform they need, and pressure Nvidia to open up their drivers, for example.
That leads more people to think Linux might be for them instead of defaulting to Windows/Mac, which brings in many development contributions. And the companies that are paid to provide services, or earn money from users, have an interest in making things better. The nature of Linux means that no matter how big those companies become, much of the work they do will benefit every distribution and the community as a whole: it's hard for them to avoid that even if they tried!
Well, first, everyone should avoid saying "distro" the way car guys should avoid saying "tranny".
Deal with the facts: Red Hat made Red Hat certification the gold standard. It went from "if you can install Linux, you're certified" to an incredibly difficult exam that people smarter than me fail. Over the course of two days.
I'd just like to interject for a moment. What you're referring to as Linux is in fact GNU/Linux, or as I've recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities, and vital system components comprising a full OS as defined by POSIX.
Many computer users run a modified version of the GNU system every day without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is called "Linux", and many of its users are not aware that it is basically the GNU system, developed by the GNU Project.
There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called "Linux" distributions are really distributions of GNU/Linux.
Or you can make the effort to switch, use BusyBox, and live life as you deserve, like you have ever since you left Norton/MS-DOS behind.
"If more people use GNU/Linux, there will be more money to pay developers, so Linux gets better for everyone!"
Here's the cultural and licensing difference: the Windows world is closed. If you need something new, you have to hope the M$ documentation says what you'd want, and you can't update, change, or streamline anything yourself; you need to get M$ to do it for you. That only happens if they feel it's in their interest, not yours.
But anything you need done in a GNU/Linux system, you can do (assuming you have the time and skill). So if you're interested in improving Bluetooth battery life, or the way X interacts with GPUs so that something like the hot GPU switching of some laptops can work, you can make it happen. And if you've done it right, the odds are really good that all the other GNU/Linux systems will pick up your version, or extend it even further to meet other needs!
Just as important: if the organization produces something that breaks your workflow, you don't have to use it at all, whereas M$ will force broken updates on you, and the best you can do is delay them...
This https://bugs.launchpad.net/ubuntu/+source/libreoffice/+bug/1729821 is a two-and-a-half-year-old bug in the LibreOffice snap that stops you from opening documents on SMB shares. The deb version just works.
That's snaps for you. It seems there's too much going on inside a snap for it to just work. And nobody looks at the bug reports.
I'm sure many people will look for a new distribution after this move, and some will quit Ubuntu because of it. Mint is probably as plug-and-play as Ubuntu is/was. I use Debian, BTW.
I definitely recommend Mint to anyone who wants to try Linux, or who doesn't want the bloat that comes with Ubuntu. It's a great user experience that makes running a Linux system easier than Windows.
I invite everyone to take a look at the remains of the three-year-long exercise that is the "Can I have a way to turn off automatic updates?" thread on their forum:
https://forum.snapcraft.io/t/disabling-automatic-refresh-for-snap-from-store/707/274
It's not so much that the snap developers haven't implemented it; it's that their attitude is YOU DON'T ACTUALLY WANT THAT. There's a flat categorical refusal to admit that use cases other than their own exist. That... doesn't sit well with me, coming from the people running a package manager.
Oh, and it insists on installing things into '/snap', and you can't override that default either.
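(For what it's worth, much later snapd releases did eventually grow a hold mechanism, long after that forum thread began. On a recent snapd it looks like this:)

    # Hold automatic refreshes for all snaps indefinitely
    $ sudo snap refresh --hold

    # Or hold refreshes of a single snap for a fixed window
    $ sudo snap refresh --hold=72h firefox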
They interfere with reproducibility, they're no good for offline machines, and, well, the whole thing leaves a bad taste. Very disappointed that Canonical has gone in this direction, and even more so by a major distribution being based on it.
I like AppImage much more.
While a snap necessarily ships all the libraries an app needs, an AppImage picks a distribution as the "oldest supported" target and bundles only what isn't reliably available and installed by default. It also avoids the whole mess of mounting filesystems and needing a privileged daemon running as root, by using a different mechanism to bundle the libraries.
The trade-off is that a snap is more universal, while an AppImage works across a more limited range of distributions. But even snaps are limited by how, for example, graphics APIs and kernel interfaces change.
Or did I get that wrong? Does AppImage use a suid helper to mount a filesystem image?
I use the AppImage for Cura. It's pretty bad. I don't know if it's worse than snap, though.
I also run Cura through an AppImage. Why do you say it's that bad? Apart from the load time, I've had no problems...
I prefer static linking if you have to rely on a specific version of a library. Alternatively, compile against older libraries so it can be shared in more places.
I came here to say the same thing.
It turns out that each and every problem solved by snaps could be solved better with static linking combined with a per-application "private directory" in /opt.
You forgot to mention Flatpak and the other valid alternatives, which offer their own sets of pros and cons.
I'm surprised Flatpak hasn't come up more, because it makes the same functionality trade-offs, yet you can host what you want yourself, and many apps build on the shared freedesktop-sdk runtime and pull resources from there.
Read the article? Flatpak and AppImage are mentioned.
Canonical reminds me a lot of Microsoft in its early days. They'll do sordid things to break compatibility in ways that help them in the market. Unlike Microsoft, they don't have a monopoly, so it's harder for them to get away with it. That doesn't stop them from trying, though.
Oh my god, this again. Name one DOCUMENTED API that Microsoft has broken.
Removing int 21h and rearranging the lParam bits into wParam don't count, because 1) that's a different subsystem and the old code still runs under the Win16 subsystem, and 2) they told everyone years in advance that it was going to happen.
Now go read Raymond Chen's blog and see the lengths Microsoft goes to for compatibility in new versions of Windows. That includes shims to support programs that relied on buggy behavior.
Undocumented APIs were undocumented for a reason: the company had not committed to maintaining them for the long term. The programmers of that era were lazy and stupid to use them. I, for one, never did.
You mean like "DirectX"? From Win95 to today, we've had to download new drivers every few months because of changes in the underlying operating system. Win10 was the worst, but the day an XP update arrived and I had to buy a new video card takes the cake. It didn't need to be fast; its only goal in life was to connect to a TV over composite video, but Microsoft broke it and there was never a new driver.
You picked a bad example. DirectX 12 is the newest, but DirectX 9 apps from 2002 are still fully supported on Windows 10 today, 18 years later. I still play Unreal Tournament 2004 (which I bought in 2004) on my Windows 10 box.
The video card problem you had? That's on the video card OEMs, who have a very, very bad track record of doing some of the worst forbidden, undocumented, and lazy things in their drivers, or of simply dropping support for cards not for technical reasons but because they want you to buy their newest product. My friends in the video game industry have told me too many stories of video card drivers doing stupid things that break their games.
Personally, I had a computer whose GPU the OEM had abandoned when a new version of Windows appeared. It turned out that if you replaced a string in the driver installation files with one from their newer GPUs, the old GPU worked fine. The OEM simply wasn't looking for the old chip anymore.
Yah. As I posted on the Chromebook/Netbook article as well: I hate sandboxes!
So I moved my daughter from a Raspberry Pi to a full desktop with Kubuntu so she could run the latest version of Minecraft Java Edition. It worked on her Pi (no, not just the Pi edition), but after some update they dropped 32-bit support.
Well, anyway, the only way to make it work on Linux was to use snap. There were some very long tutorials for getting it working without snap, but none of them worked for us.
Snap then means all her worlds are stored in some mysterious folder inside the snap filesystem instead of the fairly obvious ~/.minecraft folder where they used to live. I had to move them. I tried a symbolic link, but it didn't work.
She's been building those worlds for years; some we've played together, and she's attached to them. Now I'm afraid that the next time an update comes along, I won't be able to find them!
I don't know, maybe I should look at Minecraft under Wine. It's gotten to the point where there's less in the way of running a Windows program on Linux than running something built for Linux. At least it's not that hard to remember that everything found on Wine's "C: drive" lives in ~/.wine/drive_c, and that Windows programs can see the home directory, or even the whole directory tree, at h:/ and v:/.
Simple: an old JRE and MC will work perfectly.
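One hedged suggestion for the worlds problem above: where a symlink fails (strictly confined snaps won't follow links out of their allowed paths), a bind mount sometimes works, because the data then genuinely appears inside a path the snap may touch. Untested here, and the snap-side path varies by snap and revision:

    # Hypothetical paths: point the snap's data folder at the real one
    $ sudo mount --bind ~/.minecraft ~/snap/minecraft/current/.minecraft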
Another big problem with snaps is how they completely ignore your theme customizations. It's not just about looks, it's about usability. For example, I have to use larger-than-standard fonts and a red mouse pointer so I don't lose track of it. But snaps say: we know better than you what you need. No thanks.
That I didn't know; I assumed snaps would respect your base system, since there's no reason they shouldn't. Can you at least configure each snap separately, or snap as a whole?
I think 20.04 LTS has moved in that direction, but it still doesn't work the way a user would expect, i.e. use the settings you've already chosen so there's no difference between snap and "normal" applications. (I'm still on 18.04 LTS with the snap system uninstalled.)
My problem with snaps is the security fencing around them. I got to the point of having an ffmpeg snap working with my NVidia card, but it couldn't operate on files on... NFS shares? And there's no way to disable that restriction.
"But for users concerned with the ideology of free and open source software, it's a dangerous step toward the kinds of proprietary 'walled gardens' that may have driven them to Linux in the first place."
Yes, yes: Android and iOS. But can you call it proprietary when you can get everything from source? It seems people just want to stretch words that don't apply. This is open source that even Microsoft couldn't bring down. And yet we use the language of the oppressed, like "walled garden" and "proprietary", when the very keys to our freedom are written into a legal instrument like the GPL.
The Snap Store is literally proprietary. The software you install from it may be open source, but it's packaged and distributed by a single company without any transparency.
"Mint founder Clement Lefebvre made it clear that the Ubuntu spin-off did not approve of the new package format and would not be including it on base installs."
"Further, he announced that Mint 20 would actively block users from installing the snap framework through the package manager."
These aren't entirely right. They don't disapprove of snaps themselves; they disapprove of snaps, and the whole snap subsystem, being installed invisibly. The goal is not to prevent users from installing snaps through the package manager, but to prevent the package manager from installing snaps behind the user's back.
"In Linux Mint 20, Chromium won't be an empty package which installs snapd behind your back. It will be an empty package which tells you why it's empty and tells you where to look to get Chromium yourself."
That's the key. They don't oppose snaps themselves; they oppose breaking existing expectations without notice.
Did you read the article, or did you just skim two paragraphs and write this pointless comment?
The rest of the article doesn't change the fact that it opens with those statements.
You may be right that I didn't need to include the quotes from the Mint blog, since they were in the article, but they are evidence that the statements quoted above are wrong, so I repeated them for clarity.
The fact is, you cherry-picked a single sentence to criticize and made it look like the article says something it doesn't. So either you haven't read the article, or you're trying to spin some kind of narrative to discredit it. Which is it?
You're wrong too, by the way. Clem doesn't approve of snaps at all. The Chromium package was the last straw, but even before that he had explained why he thinks the concept is bad for Linux.
Funnily enough, you didn't mention those parts of the blog, which answers that first question.
A bad idea, but not a dealbreaker. Until the promises were broken.
It doesn't change the fact that it's somewhat sensationalist to open with those lines, when the immediate cause of the sudden change was the broken promises and the behind-the-scenes behavior, not the subsystem itself.
Wouldn't that lose all the benefits of dynamically linked libraries? That common files get loaded only once and mostly live in cache?
(sarcasm) That's the beauty of snaps, flatpaks, and even containers. They duplicate all the dependencies, yet still load them dynamically, giving you all the size benefits of static binaries combined with the security benefits of DLLs! (/sarcasm)
(OK, you can do intelligent sharing of libraries with containers, which is useful on big systems.)
My understanding of the "DLL benefits" is that they save disk space, at the cost of dependency hell and a can of security worms. (OK, maybe also cache efficiency, if two resident programs use the same libraries.) But are disk space and memory really the limiting factor anymore? Or is it ease of use, ease of installation, and security?
Static Lyfe!
Not on a desktop computer. On a RasPi, an SBC, or a more constrained embedded application, yes! Using giant amounts of memory and disk space because "hey, it'll be there anyway" is a stupid decision.
"Or is it ease of use, ease of installation, and security?" Why does it have to be one or the other? Limiting bloat doesn't have to mean less ease of use, ease of installation, or security.
This question is as old as the personal computer itself. Even the 486 crowd tried to tell the 386 crowd, "Oh, with 2 megabytes of RAM and the endless space of a 60-megabyte hard drive, you'll never have to think about size limits again!" Today, the Linux kernel alone weighs around 60 MiB compressed, and a simple 'ls' would exhaust those 2 MB of RAM.
Disk space and memory always have been a limitation and always will be. The explanation is simple: drives and RAM keep growing, but so does the software. The ratio between the two has stayed virtually the same for about 30 years (or more).
Maybe it's time to switch from Ubuntu to Mint.
Reminds me of development in Node.js. Before you've written a single line of code, your project can already include 10k files weighing in at a huge number of MB.
A very strange BS philosophy of "we know better" meets "newest is best".
Not really. More like, "Hey, don't do things behind our backs that you already promised not to do."
The snap/Ubuntu developers said snap would never replace apt, yet Mint showed that some apt packages have been "updated" to install snaps behind the scenes. That's what set them off.
Not what? Isn't that strange? Isn't that BS? Isn't it a mixture of those two philosophies?
Linux soiled the bed when it went to packages. In the old days, you compiled everything yourself.
I love it when people want contradictory things. Many people switched to Linux for the security and the absence of the viruses that seem to plague Windows.
Back in those old days, I could be sure the source code I was building was clean and that I had a pristine copy. I mean, a black hat's best laugh would be getting you to compile their virus, malware, or backdoor yourself. Assuming the source code was intact, you could at least trust the user building it to act without malice; maybe not without stupidity regarding features and the like, but at least without sheer evil.
Then came the distributions. Of course, they didn't play well with user-built software, because they couldn't know which parts you had built yourself versus which they had installed. And of course, you trusted them fully to build the packages they claimed to build, and to give you binaries built from the clean source code. The IQ required of a Linux user could start falling to the level of the average PC or Mac user.
People naturally wanted freedom, so Linux distributions had to start letting people point at other package repositories, and all hell broke loose, for example with regard to versions and packages without consistent names or locations. Not to mention that you now had yet another group of people building your software for you whom you had to trust. God only knows who they are. A friend gave you the line to add to your repository file from the Internet, so it has to be safe, right?
And now we've almost come full circle. Packages that are essentially statically linked, or that, if dynamically linked, carry everything they need with them. You pay for it in size overhead. Plenty of people longed for exactly this after getting stuck in unresolvable dependency loops. Now, at least, you know the distribution's own people are the ones building it, and hopefully delivering the software safely. Of course, freedom lovers will want to be able to pull the new packages from other sources too. I don't see much difference between that and getting traditional packages from other sources.
Is it sneaky to have it enabled by default? I don't know. I haven't installed Ubuntu in a long time. Have you? No, seriously: is there a checkbox or a toggle? I suspect it's something that can be turned off. Then again, for many Linux users these days, having it on makes things "just work". And I suspect snap packages can be uninstalled if you want to go back to managing dependencies yourself. I don't think it's evil. The cat has been out of the bag for decades. Someone just figured out that storage is now cheap enough to package things in a much easier, if wasteful, way.
The good news is that you can still download the source for most packages, or play the game of prebuilt packages and dependencies. Snap doesn't prevent that. It just gives more PCs and users a faster and more reliable option.
The world switched to Linux for many reasons, not just the lack of viruses. It is the most common operating system in the world (yes, I'm counting Android).
Uh... package managers, which are one of the things that set Linux distributions apart from other operating systems!
You rightly note that snap doesn't prevent you (in cases where the software exists in classic and open source forms) from downloading the software and managing the dependencies yourself. The truth is, though, that this leads to bloat and a technical fork that developers will avoid.
Libraries, libraries, libraries. Welcome back to DLL hell.
Shared libraries are the reason people tear their hair out when building Linux programs themselves. Application "A" needs at least version 5 of library "X", while application "B" has never heard of version 5 and might not even work with anything newer than version 3. And of course, versions 3 and 5 of library "X" can't coexist on one system due to incompatible header files or some such. That's where package managers and packages come from. I remember having to go through a confusing procedure to install Kdenlive (a non-linear video editor) on Ubuntu, in which all of its dependencies were installed into a subdirectory of Kdenlive's own directory structure. And then there's Firefox, which seems to bundle all of its dependencies, avoiding the libraries installed by the system. But at least Firefox handles all of that smoothly in its installer.
Apple thought it could beat this problem by installing programs as bundles, i.e. directories containing the application itself and its dependencies. This, as noted above, makes shared libraries act like statically linked ones, since each application ends up shipping with its own copy of a given library, with the double consequence of a) requiring much more disk space and RAM, and b) leaving many programs on outdated libraries that never benefit from library updates.
And I'll stop here to point out that this is exactly how snaps address the library problem.
But then someone at Apple saw the madness of this and came up with something called Frameworks which, from everything I can see, is just another name for shared libraries, except they're stored in a system directory rather than in the application bundle itself, and several programs share the same object modules when they run. You know, like shared libraries. Great, isn't it? Well, it's fine as long as the Frameworks stay healthy. But then, not everyone was happy with Apple's approach, and not every app developer wanted to build a Mac edition of their app, so we also have Brew, billed as "the missing package manager for macOS", which tries to do Linux-style libraries on macOS and keeps its OWN directory for shared libraries, separate from the system directory for Frameworks and the traditional unix /usr location.
So is this the long game for snaps? Will we see snap advocates acknowledge that shared libraries actually have merit, and offer some kind of super-snaps that reintroduce them in layers?
It reminds me of when Docker appeared, and the endless war between those who write software and those who deploy and maintain it.
They already have; the snap environment can be extended with core snaps.
P.S. Something that hasn't come up much in this discussion is that snap (and Flatpak and AppImage) also works as a sandbox. It's not just about dependencies.
Linux From Scratch is a thing, and you can run it if you want.
I tried LFS years ago. In the end, packages won, because eventually you have to ask whether it's worth spending endless hours typing "./configure && make && make install".
Fair point. LFS is for hobbyists.
It's not just a case of development teams with limited resources publishing software with less testing while supporting a wider diversity of platforms; it's a development philosophy!
Waterfall is a linear, sequential lifecycle model, while Agile is continuous iteration of development and testing within the software development process. Agile means they'll get there eventually (hopefully), given enough time and resources (and probably a lot of retrospectives). Snaps enable the equivalent of technical debt: poorly-thought-out solutions, or resource-limited teams shipping unfinished software. Can you imagine the design team at a heart-and-lung machine company iteratively shipping a device that simply has to work while sitting on a pile of technical debt?
For some reason, no developer will agree with the above.
Last time I checked, snaps don't support extended attributes and therefore can't work under many modern Linux security systems.
If the snap includes all the libraries the program needs to run, and every program has its own copy of those libraries, what happens when a serious bug is found in one of them? If the libraries live in one place, you simply update the library. If every snap ships a private copy of the library, you now have to update all of your snaps to be sure you've removed that bug.
Well, at least LD_PRELOAD still works; however, the target architecture of your library and the snap's architecture have to match, and a snap may well have been built for a different, older architecture. So running a locally patched library system-wide, as one used to, has become less trivial.
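As a refresher, the classic trick looks like this (library and program names are placeholders); note that for a snap-confined program the preload generally can't reach inside the snap's mount namespace, which is exactly the point being made:

    # Run one program against a locally patched library,
    # without touching the system-wide copy
    $ LD_PRELOAD=/home/user/lib/libfixed.so.1 some-program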
Easy on Ubuntu and thoroughly recommended: uninstall snap (https://askubuntu.com/questions/1035915/how-to-remove-snap-store-from-ubuntu), block snap installations (https://askubuntu.com/questions/75895/), then install flatpak, the Flathub repository, and the GNOME Software plugin (https://medium.com/@uncertainquark/how-to-install-flatpak-apps-in-ubuntu-20-04-lts-6c3f632cc605). Be happier.
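A condensed sketch of those steps on 20.04, assuming the package names from the linked answers:

    $ sudo apt purge snapd                  # removes snapd and installed snaps
    $ sudo apt-mark hold snapd              # keeps apt from pulling it back in
    $ sudo apt install flatpak gnome-software-plugin-flatpak
    $ flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo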
This Chromium trick is annoying because the snap-installed Chromium doesn't see the real filesystem and can't simply save downloaded files where you want them.
I installed genuine Chrome from a deb file and got on with my life.
Bundling all dependencies with every package isn't just wasteful because of the vastly higher disk usage; it also creates a massive remediation problem when security vulnerabilities turn up. Suppose there's another critical vulnerability in a popular library such as OpenSSL. With classic packaging, Ubuntu can simply release an updated openssl package for each supported release (maybe three or four of them at a time) and all applications automatically use the updated library. With snaps, each application bundles its own libraries, so there may be many different snap packages containing the vulnerable library, and every one of them will have to be updated manually by its own developers.
Yes. And all of this makes those security problems worse in practice, because it encourages upstream projects to be lazy about integrating new versions of the libraries they bundle... or even about which libraries they bundle at all.
Just say no.
That's precisely what LMDE, Linux Mint Debian Edition, is: Mint's effort to deliver the same user experience but without the Ubuntu layer in between. It's their plan B in case Canonical ever makes it too difficult to build on the Ubuntu base.
Try it: https://www.linuxmint.com/download_lmde.php
Well, time to find and install a new distribution, I guess.
"Snaps install from a proprietary Canonical-specific source. If you want to distribute snaps, you have to create an account with Canonical and host them there."
That's not quite true: you can download a snap file from anywhere and install it with the "devmode" option. We distribute our app that way, and it works great. I don't like the store lock-in either, but snap as a system doesn't require it.
We've had problems in the past with AppImages (and other package systems) not running on some distributions or distribution versions. And while snaps are huge, it means everything the end user needs is in one installer. From a development standpoint, it also means all users run the app against exactly the same library versions, so we see fewer bugs.
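Sideloading as described above amounts to something like this (filename hypothetical; a strictly confined third-party snap may need --dangerous instead):

    $ sudo snap install ./ourapp_1.0_amd64.snap --devmode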
Snap lost me as a user 10 seconds after I installed it and saw that it had just claimed /snap in the root directory.
That alone says everything about their acquaintance with Unix traditions.
Snap uses much less space than the article suggests. The article assumes a snap occupies the amount of disk space reported at the mounted path, but in reality it's much smaller, because snaps are heavily compressed. The values shown are uncompressed sizes, which never actually land on your filesystem. This is explained in the snap docs: https://snapcraft.io/docs/system-snap-directory
I'm afraid you've misread that page of the docs. The size of the filesystem image is accurate; what's larger is the directory it gets mounted on.
In the case of youtube-dl, the downloaded snap file is 92 MB, and the mounted filesystem has an apparent size of 285 MB.
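Anyone can check this themselves; roughly (the revision number is a placeholder, the figures mirror those quoted above):

    # Compressed image actually stored on disk
    $ du -h /var/lib/snapd/snaps/youtube-dl_*.snap
    92M     /var/lib/snapd/snaps/youtube-dl_95.snap

    # Apparent size of the mounted, uncompressed view
    $ df -h /snap/youtube-dl/current
    Filesystem      Size  Used Avail Use% Mounted on
    /dev/loop12     285M  285M     0 100% /snap/youtube-dl/95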
Is that why snaps start so slowly?
At the end of the day, Canonical is a business, and it needs to attract a wider user base than just command-line ninjas. Even if there's a sacrifice, my whole family can install software on Ubuntu; I never have to fix things afterwards. For the foreseeable future, there will be distributions for those with the skills to build from source; I don't see those disappearing while there are people like the readers here. To me, it's natural for Linux to evolve both easier, clumsier versions for the general public and no-frills distributions for demanding users to drive development.
Static linking just means you put the object code of your libraries into your binary, rather than resolving it externally from a shared library (dynamic linking). All snap does is ship the dynamic libraries alongside the app, so you don't have to rely on the operating system to supply them.
Suppose there's a security vulnerability in a common library used by your application. If it's statically linked, the only way to fix the vulnerability is to rebuild the application against a new version of the library, distribute it, and then get users to update the application. With dynamic libraries, the idea is that all a user has to do is pull updates from their package manager; no action is required from the application maintainer (in most cases).
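The difference is easy to see on any Linux box with a toy C file (hello.c here is a stand-in):

    # Dynamic (the default): libc is resolved at run time
    $ cc hello.c -o hello_dyn
    $ ldd hello_dyn
            libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x...)

    # Static: libc's object code is baked into the binary itself
    $ cc -static hello.c -o hello_static
    $ ldd hello_static
            not a dynamic executable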
And if I understand it correctly, this is a weak point for snap, Flatpak, and AppImage alike. There are programs whose bundled libraries will never be updated, simply because the maintainer doesn't bother. That's a drawback. What remains as an advantage, in my opinion, is the sandboxing, which most people find annoying and which poses genuine problems with some programs (for example, Firefox/Chrome can't talk to KeePassX in my case).
The main reason I don't like snap packages is the extra security prompts they impose. You get annoying Windows-style pop-ups when saving files. (You know, security pop-ups so frequent that they train you to click through them without reading.) The pop-ups in the Chromium snap are completely redundant with Chromium's own save-file dialog.
On top of that, it can be difficult to access files outside your home folder. I had to give up the VSCode snap because I couldn't change anything in my root-mounted work folder, despite owning the folder and all the files. That alone made the app impractical.
I couldn't care less about the politics around snap, but there are plenty of usability reasons against it.
In marketing, users are supposed to come first. Here, snap packages feel more like developer convenience first and users last.
So far, I haven't seen a good enough reason to switch to a snap-flavored application architecture.
Ubuntu and Canonical are dead to me over this. Never going back.
I love Linux (Ubuntu), and I wouldn't use it if snap didn't exist. New Linux users don't know how to use the terminal.
Lol. When you install software from the Software Manager application, you probably don't even know or care whether you're installing a package from a repository or a snap (or a flatpak, which integrates into Software Manager 100 percent the same as snap does). As a new user you don't care, but you should care about what it means for free software.
We're on Debian!! :D
"Instead of building versions for each supported release of Ubuntu, they can release a single snap that works on all of them."
Well, yes, but no. Snapd itself has versioned features, and an older snapd may not be able to run newer snap packages.
Very good article, but it's still missing some points:
1. Snaps are advertised as making packaging easier for developers. True, creating APT packages is a rather convoluted and pedantic procedure (for example, insisting on changelog entries and version bumps for every edit). But the right fix isn't to replace APT packages; it's to make them easier to build.
2. Serving a variety of distribution releases with one snap package defeats that diversity. If every release of a distribution ships the same set of snaps, they're no longer really separate. Snaps aren't the right solution; reducing the diversity would be. If the most recent release of a distribution were reliably stable, there'd be no need for "LTS" releases. Make your development cycle more robust and reliable instead!
3. To get the latest versions of packages on an older distribution, there are PPAs, Personal Package Archives. Ubuntu supports them (always has), even with a build service, and pulling a few of them into a local install works incredibly well. They integrate into the normal upgrade process, with no extra hassle. For example:
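The usual pattern, with a placeholder PPA name:

    $ sudo add-apt-repository ppa:some-team/some-app
    $ sudo apt update && sudo apt upgrade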
A common thread in the desktop Linux community is that no one can agree on anything. It's so fragmented, with so many different ways of doing everything, that it discourages developers and users alike. It seems to me that if snaps get rejected by many Linux distributions, they become yet another problem rather than a solution.
Us geeks have the terminal to install things, and after all these years we have stores too. Why the jealousy? No one is taking the terminal or APT away from Ubuntu! Snap is just a friendly way for non-geeks to install things, so why shouldn't it exist? Do you need to be the only type of user using Linux?
But there already are Ubuntu APT packages (as the article explains), and from the user's point of view, installing an APT package is pretty much the same procedure as installing a snap package.
Snap is advertised and designed as a convenience for developers, and that's it: there's no benefit for users over the app stores that existed before.