"Federal immigration agents bashed open a door and detained a U.S. citizen in his Minnesota home at gunpoint without a warrant, then led him out onto the streets in his underwear in subfreezing conditions, according to his family and videos reviewed by The Associated Press."
Blaise reshared this.
“I’m a dictator,” Trump continued. “But sometimes you need a dictator!...”
If he lets the cat all the way out of the bag, do Trump and his fans still get to use the, "but it's out of context" argument?
Blaise reshared this.
Rogue's is back among the living!
#HashtagGames
#CatPetPeeves
That rodent attached to your desktop.
Click mousie. Drag mousie. Kill mousie.
Radio Free Trumpistan reshared this.
#HashtagGames
#MentorAMovieOrPlay
Creed IV: lessons learned from Rocky Balboa on what to avoid when you deserve to retire.
Radio Free Trumpistan reshared this.
#HashtagGames
#MentorAMovieOrPlay
Help Wanted: translator for Lost in Translation
Radio Free Trumpistan reshared this.
#HashtagGames
#MentorAMovieOrPlay
Know this: you don't need no steenkin badges!
#BlazingSaddles
Radio Free Trumpistan reshared this.
#HashtagGames
#MentorAMovieOrPlay
What Cyrano de Bergerac should have done instead of impersonating that guy.
Radio Free Trumpistan reshared this.
#HashtagGames
#MentorAMovieOrPlay
Avatar: learning The Way Of Water
Radio Free Trumpistan reshared this.
#HashtagGames
#MentorAMovieOrPlay
Benji--goes to obedience school
Radio Free Trumpistan reshared this.
#HashtagGames
#MentorAMovieOrPlay
Bob and Carol and Ted and Alice at the head of the classroom
Sylkykat (she/her) 📚🖖🦉🐱☕️🇺🇦 reshared this.
#HashtagGames
#MentorAMovieOrPlay
Stand And Deliver on your final exams!
...which is what that movie was actually about, so-- #NoEffort
Sylkykat (she/her) 📚🖖🦉🐱☕️🇺🇦 reshared this.
Well I was there and I saw what you did
I saw it with my own two eyes
So you can wipe off that grin
Know where you've been
It's all been a pack of lies
And I can feel it comin' in Gruyere tonight--oh Lord
I've been waitin' for this moment, all my life, oh Lord...
#PhilCollins
Radio Free Trumpistan reshared this.
Unus Nemo
in reply to liv • •@liv
Right 😀 Good luck to him. Real life does not work that way. I have seen the quality of AI-generated code. It is worse than the code being generated by sweatshops in India, and that is very bad. AI is great for adding an IntelliSense feel to Vim, for code completion. But it cannot be trusted to write applications.
Besides, why would I want to stop doing something I love to do? Telling me "good news, you can now stop writing code" is sort of like telling the average American "good news, you can now stop watching TV" 😉 Good luck with that!
liv likes this.
liv
in reply to Unus Nemo • •
Totally agree: AI code quality is still rough. But for me, if it could handle chores like cooking, cleaning, and other boring everyday tasks, that would be real progress. kek 😇
Unus Nemo likes this.
Unus Nemo
in reply to liv • •@liv
lol, give it time. I use only local AI and train the LLMs myself. Though even then I would not yet trust code produced by the AI in any case. The quality of the material you train the LLM with will definitely show in the results. My LLMs will produce far better code than most commercial AIs that are trained on anything they can scrape from the internet. Though I do not focus on training my LLMs to code; I mainly train them in philosophy, psychology, and nutrition 😉
liv likes this.
liv
in reply to Unus Nemo • •Unus Nemo likes this.
Unus Nemo
in reply to liv • •@liv
You will not be able to train a base model with it, but you can train with an RTX 5060 Ti; it will just take a while, and of course you will have to train small models. To train larger models, I am afraid, you have to spend a lot more. At the least you would need an RTX Pro 6000 with 96 GB of GDDR7, which will set you back around $8,500. Though that is what is required for serious training.
You can do a lot with a consumer-grade GPU these days, though serious AI GPUs for commercial use (above the RTX Pro 6000) start at around $16,000, which is a bit steep for a hobby 😉.
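For a rough sense of why the 96 GB card matters: a common back-of-envelope heuristic (an assumption here, not an exact measurement) is about 16 bytes of GPU memory per parameter for full fine-tuning with Adam — fp32 master weights, gradients, and two optimizer moments — before even counting activations. A quick sketch:

```python
# Rule-of-thumb VRAM estimate for full fine-tuning with the Adam
# optimizer. Assumes fp32 weights (4 B) + gradients (4 B) + two Adam
# moment states (8 B) per parameter = ~16 B/param. Activations and
# framework overhead are excluded, so real usage is higher; this is a
# heuristic, not a measurement.

def training_vram_gb(params_billions: float, bytes_per_param: int = 16) -> float:
    """Return an approximate VRAM requirement in GiB for full training."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

for size in (1, 7):
    print(f"{size}B params -> ~{training_vram_gb(size):.0f} GiB")
```

By this estimate a 7B model wants on the order of 100 GiB just for weights plus optimizer state, which is why serious training points at a 96 GB card or bigger, while a ~1B model can fit on a 16 GB RTX 5060 Ti (with further savings possible from techniques like LoRA or gradient checkpointing).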
liv likes this.
liv
in reply to Unus Nemo • •
Unus Nemo
in reply to liv • •@liv
I train models for my research. I do not monetize my efforts at this point, and may never. Typically, my trained LLMs will be part of a project I am working on, such as making NPCs more realistic in an MMORPG or answering questions on brewing 😉. My main goal in life is to enjoy life, so I spend a lot of time researching many aspects of it.
If you have a bit of IT skills then you could download llama.cpp which is not affiliated with Meta by the way.
If you have never used git before then start with these basic steps.
Create a fork of the project. To do this you need to make an account on GitHub if you do not already have one. This is not technically required: if you do not intend to modify the code in any way, you can just clone their project directly, though then you will not be able to sync your local changes with the project.
Clone your fork of the project to your system. You will need git installed of course.
Review the build instructions. If you do not have an adequate GPU, you can compile it for CPU-only use. As long as you only use models that fit into your computer's memory you will be fine. CPU-only is a lot slower than using a GPU, though I have done it on many occasions on laptops with poor GPUs.
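Whether a model "fits into your computer's memory" can be sanity-checked with arithmetic: a 4-bit-quantized GGUF file averages roughly 4.5 bits per weight (Q4_K_M-style), plus some headroom for the KV cache and buffers. The numbers below are illustrative assumptions, not exact figures:

```python
# Back-of-envelope check: will a quantized model fit in system RAM for
# CPU inference? Assumes ~4.5 bits per weight (typical of 4-bit GGUF
# quantization) plus a flat ~2 GiB of overhead for KV cache and
# buffers -- rough illustrative numbers only.

def model_ram_gb(params_billions: float, bits_per_weight: float = 4.5,
                 overhead_gb: float = 2.0) -> float:
    """Approximate resident memory in GiB for CPU inference."""
    weights_gb = params_billions * 1e9 * bits_per_weight / 8 / 1024**3
    return weights_gb + overhead_gb

def fits_in_ram(params_billions: float, ram_gb: float) -> bool:
    return model_ram_gb(params_billions) <= ram_gb

print(fits_in_ram(7, 16))   # a 4-bit 7B model in 16 GiB of RAM
print(fits_in_ram(70, 16))  # a 70B model, which clearly does not fit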
There is a user-friendly wrapper around llama.cpp called Ollama. If you just want to play with local AI it is a great way to go. You could check both out and see which one fits your needs best. Options are a great part of life 😁
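The fork-and-clone steps above might look like this in practice. GITHUB_USER is a placeholder for your own account name, the fork itself is created in the GitHub web UI, and the exact build flags vary by platform, so treat this as a sketch and check the project's own README:

```shell
# Placeholder: replace GITHUB_USER with your GitHub account name.
# (Fork the llama.cpp repository in the GitHub web UI first.)
git clone https://github.com/GITHUB_USER/llama.cpp.git
cd llama.cpp

# Optional: track the original project so your fork can stay in sync.
git remote add upstream https://github.com/ggml-org/llama.cpp.git
git fetch upstream

# CPU-only build (no GPU required); see the repo docs for GPU options.
cmake -B build
cmake --build build --config Release
```

If you skipped the fork, cloning the upstream URL directly works the same way; you just lose the ability to push your own changes.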
liv likes this.
liv
in reply to Unus Nemo • •Unus Nemo likes this.