when running models locally, I presume the models are trained and the weights and such are exported as a "model" — for example Meta's Llama models.
Do these models get updated, with new versions released? I don't quite understand.
wow 10 months flew by since this was posted, and since then the United States had a surprise bipartisan privacy bill that sort of addresses the issues you and I mentioned. https://www.washingtonpost.com/technology/2024/04/07/congress-privacy-deal-cantwell-rodgers/
This bill was proposed around the same time the TikTok ban was announced. I speculate that lawmakers had a difficult time framing the arguments against TikTok because citizens' data has no protections, so there were no easy legal grounds to forbid the likes of TikTok from harvesting it.
From what I've heard, this bill is pretty good. I need to educate myself more on it, however.
was it ever? I participate in interview rounds at my company (several tech screens a month), and I must say a candidate's email was never something that drew attention.
you're able to unsubscribe from all those promotions; that's in settings. Personally, a once-a-month newsletter of everything that is new is helpful because I don't need to put in the effort to keep up.
For backup and sync I use Syncthing. I can specify which folder on which devices I want to sync to which folder on the server.
I use a folder-based gallery on my phone, so when I move stuff around on my phone (or on my server) it gets replicated on all my devices.
I also have a policy to sync specified folders (and subfolders) with my family's devices. No more "hey, can you send me all the pics from the XYZ trip?"
We take a trip, make a subfolder for that trip in a shared folder, dump all our pictures there, then get home, open the folder on the computer, and prune together.
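For anyone curious, a shared folder like that maps to an entry in Syncthing's config.xml roughly like the sketch below (the folder id, path, and device ID here are made-up placeholders; in practice it's easier to set this up from the Web GUI, which writes this file for you):

```xml
<!-- sketch of a shared-folder entry in Syncthing's config.xml; ids and paths are placeholders -->
<folder id="trip-photos" label="Trip Photos" path="/home/me/Pictures/Trips" type="sendreceive">
    <!-- one <device> line per family member's device that should receive this folder -->
    <device id="AAAAAAA-BBBBBBB-CCCCCCC-DDDDDDD-EEEEEEE-FFFFFFF-GGGGGGG-HHHHHHH"></device>
</folder>
```

Each device listed gets the folder replicated both ways (`sendreceive`), which is what makes the "prune together at home" step work.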
simply put, programming is glorified automation. There are jobs where the process that needs automating makes money.
Debian has the advantage of not using snapd like Ubuntu does. On Ubuntu you have to not only remove snaps but also instruct the package manager not to pull in snaps as dependencies and not to favor snap packages.
I have fond memories of Ubuntu being my first distro many years ago but pushing snaps onto users to compete with flatpak is a nuisance.
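For reference, the "instruct the package manager" part is usually done with an apt pin, similar to the nosnap preference file Linux Mint ships (a sketch; the filename is arbitrary, only the directory matters):

```
# /etc/apt/preferences.d/nosnap.pref
# Tell apt never to install snapd, even as a dependency of another package
Package: snapd
Pin: release a=*
Pin-Priority: -10
```

With a negative priority, apt refuses to install the package at all; you'd still run `sudo apt purge snapd` first to remove what's already installed.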
hey, that's what the internet is for; information sharing :)
for the dummies (like me) that can't read the room, especially online, a sarcasm tag /s goes a long way 🙃
you sound like a Microsoft engineer ;)
hahaha good point.
That colleague, keep in mind, is a bit older and also has Vim navigation burned into his head. I think where he was coming from is that with all these new technologies and the syntax for each of them, he'd much rather right-click in the IDE and have it show him the options instead of doing it all from the command line: for example Docker container management, Go's Delve debugger syntax, or GDB. He has a hybrid workflow though.
After having spent countless hours on my Vim config only to restart everything using Lua with nvim, I can relate to the time sink that is Vim.
Recently I used Google Maps to search for the nearest DHL so I could return a package. DHL is not that popular near me, and even when I specifically searched for DHL, I would get only their competitors in the results.
There was a DHL service center near me, and I had to scroll a bunch to find it. Oh, and apparently big box stores (or anyone) can pay Google to come up in Maps search results, even when unrelated.
I don't think they have skin in the shipping game, but their algorithms are so over-optimized that they don't even show what you're searching for; they try to infer why you're searching for it instead. That, or whoever pays them more. Certainly a search risk.