

This is amazing. Very interested in picking one up


It looks and feels a bit like Windows with the theming it has out of the box. So it’s probably an easier on-ramp, and possibly what gets recommended in “which Linux is most like Windows” Google searches and the like.


More like, large corporations not at all invested in local communities are now empowered to completely run roughshod over local governance processes. They’re actually more likely to pay folks to stall out slow approval processes so that they can take advantage of this law and start building, especially when the permit would likely have been denied because it didn’t consider easements, fire or flood risks, building and local regulatory standards, or any other manner of things. So this actually increases the likelihood of bribes, ensures that corporations pay less to your local government and more into the personal pockets of those being bribed, and simultaneously makes the buildouts less safe and less compliant, with greater risks to the local community. Basically a lose-lose for local folks, and a win-win for a giant corporation.
A better version of these flawed tactics would’ve been that failure to meet timelines opens the project to a public vote, and that every project requires a public option (e.g. a government-supplied bid on the infrastructure) to compete. That way, if the timeline expires, the project isn’t automatically awarded to people who have a vested interest in it expiring at the expense of a community; it could be awarded to a local municipal project instead.


I think RHEL 9 uses kernel 5.14 as its base.


I think it’s okay, I made a comment about the license first! It’s good discussion. I certainly like everything being copyleft, but I also get why people who make a contribution (an extension or otherwise) might want to license it differently. Ultimately whoever does the work gets to decide on the license — closed source I’ll never touch, extension or otherwise, but I’m lenient on open source.


Oh I’m fine with copyleft, even preferred. I just see any open source license, even MIT, and am pleased. Perhaps my bar is too low, but at this point anyone posting anything with open source protection to the creation is cool to me.


Hey some folks responded here which is great! For me, I think wiki and tracker are perfect like someone else mentioned, because a lot of folks without accounts can still access the knowledge created. The hard part is moderating of course. I’m not sure there is a perfect solution.
Ultimately, you’re producing something cool for the community and you get to set the terms for that; if discord is easy and sustainable, I prefer that to you doing anything else that isn’t sustainable to see the project through as long and vibrantly as you can. So in that sense just choose what makes sense.
So in short: do what makes sense for you and if one of the alternatives listed (maybe wiki it seems? That would be cool with me) works then that’s great!
I guess I’ll also plug forgejo or codeberg at this time haha
Edit: I’ll also say, more folks here for discussion is cool too, and good to have you posting and hope to see more discussion around it in the future here!


MIT license, cool! I’ll check this out. Any chance to migrate from discord to a more open platform for community engagement?
Servo is a web browser rendering engine written in Rust, with WebGL and WebGPU support, and adaptable to desktop, mobile, and embedded applications.
Essentially it is an alternative to the engine used by Chromium-based web browsers. The other (major) web browser engines are WebKit (Safari/iOS) and Gecko (Firefox). You can see a list on Wikipedia.


I think they just don’t know. They probably just search for OpenOffice and it comes up. I think I had actually looked into it before installing LibreOffice. At this point Apache should just archive OpenOffice and redirect to LibreOffice.


TL;DW: Apache OpenOffice is not actively maintained. LibreOffice is. Both have heritage in the same original (non-Apache) OpenOffice.


It supports the thesis that Lumo is not open source in the common-sense ways most people would expect when a model claims to be open source. So in that sense, it does, though.


It’s a way to watch content you may otherwise have needed a TV antenna or cable box for.


Oh interesting. I’ll give gdm a try and see if that gives any joy. Thanks for this tip, will return tomorrow with update on this particular change
Update: alas, no dice for this change.


Prusa Core One. You can buy it prebuilt or as a kit.
Disclosure: I do not have one. I have a Creality K1 and it’s mostly great for me, but it isn’t perfect, and I would personally buy a Prusa if I were buying a new printer.
Yeah, I think I’ll go with Proxmox as a first attempt; it seems to fit what I’m looking for and the feedback here has been pretty positive on that front. My main concern now is figuring out how to provision the HDDs so that a Jellyfin LXC can use them, Nextcloud can use them, and I can save (configuration) backups to them. I’m comfortable with ZFS in general (I run it on my desktop), but I was under the impression that RAID10 would be more performant with the same redundancy when using 4 disks. Any one disk can fail, writes are at the speed of a single disk because of the mirror, and reads are roughly 2x. I lose usable disk space, but I think 16 TB is enough for me (for now, of course, haha). Am I wrong though on the ZFS vs RAID10 comparison? I guess actually I could use ZFS and create a single pool with two mirrored vdevs, something like the sketch below. I am not sure how that would affect future growth, but it should do really well for now. Does that sound like a reasonable thing to do, in your opinion?
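For reference, here’s a minimal sketch of the layout I mean (the pool name “tank” and the /dev/sdX device paths are just placeholders; /dev/disk/by-id paths are safer for real disks):
```
# Two mirrored vdevs striped together, ZFS's rough equivalent of RAID10
zpool create tank \
  mirror /dev/sda /dev/sdb \
  mirror /dev/sdc /dev/sdd

# Verify: one pool containing two mirror vdevs
zpool status tank
```
Growing later would just mean adding another mirror vdev to the pool (zpool add tank mirror disk5 disk6), which keeps the same redundancy model.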
Amazing, thank you! I think I’m gonna have to be okay with not nailing it on the first go and trialing it out the next few days. Step one sounds like proxmox to me :)
Hey, thanks so much for the response, this is great! Love the idea of offloading AI workloads to their own VMs to make managing resources easier.
Also, big thanks for the recommended software — very helpful list for me to look through, especially on the AI front. Do you have any notes on configuration for those in particular?
You can get by surprisingly well on 20B-parameter models using a Mac with decent RAM, or even on 8B-parameter models that fit on most high-end (e.g. 16 GB) graphics cards. It depends on your use cases, but I almost exclusively use smaller local models.
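For example, with something like Ollama (just one local runner option; llama.cpp or LM Studio work similarly, and the model tag here is only illustrative):
```
# Pull and chat with an ~8B model; the default quantized build fits easily on a 16 GB card
ollama pull llama3.1:8b
ollama run llama3.1:8b "Summarize the pros and cons of self-hosting email."
```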