Indeed, AKA the OpenAI playbook.
As per usual, in order to understand what it means we need to see:
Still interesting to read after the announcements, as per usual, and especially to see who will actually manufacture them at scale (SMIC? TSMC?).
It’s a classic BigTech marketing trick. They are the only ones able to build “it”, and it doesn’t matter if we like “it” or not because “it” is coming.
I believed in this BS for longer than I care to admit. I thought “Oh yes, that’s progress”, so of course it will come, it must come. It’s also very complex, so nobody but such large entities with so many resources can do it.
Then… you start to encounter more and more vaporware. Grandiose announcements, and when you try the result you can’t help but be disappointed. You compare what was promised with what you got, think it’s cool, kind of, shrug, and move on with your day. It happens again, and again. Sometimes you see something really impressive, you dig, and realize it’s a partnership with a startup or a university doing the actual research. The more time passes, the more you realize that all BigTech companies do it, across technologies. You also realize that your artist friend did something just as cool, and open-source. Their version does not look polished but it works. You find a KickStarter about a product that is genuinely novel (say Oculus DK1) and has no link (initially) with BigTech…
You finally realize, year after year, that you have been brainwashed to believe only BigTech can do it. It’s false. It’s self-serving BS meant both to prevent you from building and to make you depend on them.
You can build, we can build and we can build better.
Can we build AGI? Maybe. Can they build AGI? They sure want us to believe it, but they have lied through their teeth before, so until they actually deliver, they can NOT.
TL;DR: BigTech is not as powerful as they claim to be and they benefit from the hype, in this AI hype cycle and otherwise. They can’t be trusted.
Very cool, sincere thank you for the clarification and even the on-boarding process. Installed this way, it feels quite efficient. Will dig a bit deeper while using them more.
Ah! Wonderful. I’m always a bit reluctant with system-wide installs so I’ll put AM on hold for now but will probably tinker with AppMan/dbin soon.
Out of curiosity, one of the apps I’d usually get outside my package manager is Chromium. I’d usually download the latest build from https://download-chromium.appspot.com/ so in this situation, how would you do it using any of those solutions? Would it support adding extensions, e.g. https://chrome.google.com/webstore/detail/immersive-web-emulator/cgffilbpcibhmcfbgggfhfolhkfbhmik, that I need for development?
PS: note to self, go through bash history to see which failed apt install attempts could be replaced with such tools.
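A rough sketch of that search (assuming the default ~/.bash_history; bash history doesn’t record exit codes, so this lists all attempts and the failed ones still have to be spotted by eye):

```sh
# list past apt install commands, most frequent first
grep 'apt install' ~/.bash_history | sort | uniq -c | sort -rn
```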
Also this is a good way to reconsider integration, e.g. generating .desktop files for ~/.local/share/applications/ when using KDE rather than having to do it manually each time.
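For illustration, a minimal .desktop file for a manually downloaded app (the name and paths are hypothetical, adjust to the actual binary/AppImage):

```ini
# ~/.local/share/applications/someapp.desktop (hypothetical example)
[Desktop Entry]
Type=Application
Name=SomeApp
Exec=/home/user/Apps/SomeApp.AppImage
Icon=/home/user/Apps/icons/someapp.png
Categories=Utility;
```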
Hmmm, very interesting, thanks for the links and explanation!
I’m not “ready” for it yet so I’ve bookmarked all that (by adding a file in ~/Apps ;) but that’s definitely an interesting, and arguably neater, solution.
Honestly I try to stick to the distribution package manager as much as I can (apt on Debian stable) but sometimes it’s impossible. Getting binaries myself feels a bit “wrong” but usually works. Some, like yt-dlp as I see in your list, do have their own update mechanisms. Interesting to step back and consider the trade-off. Anyway, now thanks to you I know there are solutions for a middle ground!
I forgot the exact number but while installing Debian (Bookworm and Sid) this weekend I was shocked by how small the base install, with a window manager (a “big” one by your standards, i.e. KDE), was. Maybe 2GB, definitely less than 4GB. It all worked fine, I could browse the Web, print, edit rich text, watch videos, etc.
I have installed a ton more stuff since, e.g. Steam, Inkscape, Python libraries for computer vision, etc. and it’s still not even 10GB.
So… my suggestion is the same as I shared earlier in https://lemmy.ml/post/20673461/13899831 namely do NOT install preemptively! Assuming you have a fast and stable connection, I would argue stick to the bare minimum and add as you need.
In fact… if you want to be minimalist I would suggest doing another fresh install (it’s fast, less than 1 hour, and you can do something else at the same time) and sticking to the bare minimum right away.
TL;DR: don’t get rid of things, just avoid adding them in the first place.
Another “trick” I use is having an ~/Apps directory in which I have AppImages, binaries, etc. that I can bring from an old /home to a new one. It’s not ideal, bypassing the package manager, and it makes quite a few assumptions, architecture first, but in practice, it works.
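For what it’s worth, the only wiring it needs is a PATH entry (the directory name is just my convention):

```sh
# in ~/.bashrc (or your shell's equivalent): make binaries in ~/Apps callable from anywhere
export PATH="$HOME/Apps:$PATH"
```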
I did more than 5 installs this weekend (for … reasons) and the “trick” IMHO is …
Do NOT install things ahead of actually needing them. (Of course this assumes things take minutes to install and thus that you will have connectivity.)
For me it meant Firefox was top of the list, VLC or Steam (thus NVIDIA driver) second, vim as I had to edit crontab, etc.
Quite a few are important to me but NOT urgent, e.g. Cura (for 3D printing) and OpenSCAD (for parametric design) or Blender. So I haven’t even installed them yet.
So IMHO, as others suggested, docker/docker-compose, but only for the backend.
Now… if you really want a reproducible desktop install: NixOS. You declare your setup rather than apt install -y and “hope” it will work out. Honestly I was tempted, but as installing a fresh Debian takes me 1 hour and I do it maybe once a year, at most, there is no need for me (yet).
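For the curious, a minimal sketch of what that declaration looks like (the package list is obviously illustrative):

```nix
# /etc/nixos/configuration.nix (fragment); applied with: sudo nixos-rebuild switch
{ config, pkgs, ... }: {
  environment.systemPackages = with pkgs; [
    firefox
    vlc
    vim
  ];
}
```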
I must have expressed myself quite poorly. It is not a point about technical knowledge; in fact, if you were to know more about the topic than I do, I would expect you to be held to even higher standards and thus not promote a bad solution, let alone assume it’s the only one. I can’t imagine that even a PhD student, who is supposedly at the frontier of knowledge in their very narrow field, would assume no alternative is possible, or ever will be. This is even more the case without a complete understanding of both the landscape and OP’s actual needs, which are probably hard to express clearly, thus leading to a lot of assumptions. Here maybe a simple loud alarm from a BT speaker going out of range might be enough.
My whole point is that abandoning hope, and leading others to do so, is worse than actively searching for a good compromise.
Anyway, I don’t want to invest more energy in this discussion unfortunately, so I’m simply wishing you the best; thanks for the clarifications.
> They asked for an alternative to airtags. I provided one.
And even though I’m not OP I’m genuinely grateful for that.
> Doesn’t matter if they were compromised because like I said, everyone is eventually.
No! That’s the whole point of this Privacy community! If someone is using, to take home automation as an example, Apple HomeKit or Roomba or Google Home, they will eventually get compromised, BUT if they are using something local, e.g. Zigbee with HomeAssistant, they WILL never get compromised, because by the very local-only architecture of that solution no data is leaving the home and thus it can NOT be compromised.
The ENTIRE raison d’être of this community is not to say “Oh well… the default solutions are imperfect, we have to shrug and accept the status quo” but rather to provide genuine alternatives.
I understand a lot of people can fall into a learned-helplessness mindset, imagining that only poor solutions exist and that, therefore, it’s better to pick the least bad one, but by doing that we are giving power to BigTech, surveillance capitalism, etc.
Please do NOT say “everybody gets compromised” when you actually mean “the vast majority of people who accept to use a popular solution with trade-offs that are bad for privacy get compromised”. It sounds like a finicky difference but it’s actually totally different, because it shows that it’s not inevitable.
By taking shortcuts in your language you limit what’s conceived as possible by others who are asking for help, again, in a Privacy-focused community.
True yet still not OK.
That’s also why a lot of us do try to avoid, as much as is realistically feasible, providing any data to any company that would store it. Hence why a lot of questions here are about self-hosting, no cloud, etc. It’s not paranoia, it’s because companies cut corners and, as you correctly point out, fail to keep us safe. So it’s not about Tile specifically, they are just yet another poor example. Let’s not defend them nor this kind of practice. If people in the Privacy community are OK with that, we have a rather deep problem.
Thought the same, then tried a Corne-ish Zen and… I’ve been on a strange path from a normal keyboard to a split kbd to an Ergodox to a Moonlander (so more and more keys) down to a Corne-ish Zen (3x6), and my hands/wrists thank me daily. Even today I edited my keymap https://github.com/Utopiah/zmk-config-zen-2/blob/main/config/corneish_zen.keymap thinking “Wow… so many keys left still, I could do with less”.
I… agree, but isn’t that contradicting your previous point that innovation will come from large companies, if they only try to secure monopolies rather than genuinely innovate? I don’t understand, from that perspective, who is left to innovate if it’s neither research (focused on publishing, even when it has the actual novel insight and verifies that it does work) nor the large companies… and startups don’t get the funding either. Sorry if you mentioned it but I’m now confused as to what is left.
They just provide the data. They can question the methodology, or even provide another report with a different methodology, but if the data is correct (namely not fabricated) then it’s not up to them to decide how it’s being used. The user can decide how they define a startup, i.e. which minimum size, funding types, funding rounds, etc. Sharing their opinion on the startup landscape is unprofessional IMHO. They are of course free to do so, but to me it doesn’t question the validity of the original report.
Neat.
Warning/disclaimer: I’m not a cryptographer.
I actually tinkered with https://github.com/open-quantum-safe and it’s actually quite simple to become “post-quantum” whatever. The main idea is that one “just” has to switch their cryptographic algorithm, i.e. what one uses to encrypt/decrypt a message, from whatever they are using to a quantum-resistant one (validated by NIST or whomever you trust to evaluate them) and… voila! The only test I did was setting up Apache httpd and querying that server with Chromium and curl, all with oqs, while disabling cryptographic algorithms that were not post-quantum, and I was able (I think ;) to be “safe” with respect to this kind of attack.
Obviously this is assuming a lot, e.g. that there are no other flaws in the design of the application, but my point is that becoming quantum-resistant is, conceptually at least, quite simple.
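For reference, once httpd and curl are built against an OpenSSL with the oqs provider, the check is roughly this (the hybrid group name depends on the exact oqs build, so treat it as a sketch):

```sh
# offer only a post-quantum (hybrid) key-exchange group to the server
openssl s_client -connect localhost:443 -groups x25519_kyber768
# same idea with curl (needs to be built against the same OpenSSL)
curl --curves x25519_kyber768 https://localhost/
```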
Anyway, I find it great to demystify this kind of progress and to realize how our stack can indeed, if we do believe it’s worth it now, become resistant to more threats.
Research happens through universities, absolutely, and selling products at scale through large companies, but that’s not innovation. Innovation is bringing new products, often the result of research yes, to market. Large companies tend to be innovative by buying startups. If there are no startups, born from university research, to buy, I don’t see how large companies, often stuck in the “innovator’s dilemma”, will be able to innovate.
Thanks for linking to criticism, but can you highlight which numbers are off? I can see things about ByteDance, Ant Group, Shein but that’s irrelevant, as it’s not about the number of past successes, solely about the number of newly funded startups. Same for the CEO of ITJUZI sharing his opinion, that’s not a number.
Edit: it looks totally off, e.g. “restaurants, in a single location, such as one city, you could immediately tell that there were large numbers of new companies”, as the article is about funding, not a loan from the bank at the corner of the street.
HP Laser 107w, driverless, over LAN.
I just Ctrl+P from any software and it prints.
It also prints programmatically (e.g. for folk.computer) thanks to IPP.
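Concretely, on the CUPS side this is all it takes (the queue name and address are assumptions, adjust to your printer):

```sh
# register the printer driverless, via IPP Everywhere
lpadmin -p hp107w -E -v ipp://192.168.1.50/ipp/print -m everywhere
# then any script can print with
lp -d hp107w document.pdf
```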
I haven’t had to “think about printing” since I set that up, so I don’t know where you get that sentiment.