all 44 comments

[–]dinerburgeryum 116 points117 points  (0 children)

“Besides privacy?” excellent summation of our entire digital experience right now. 

[–]SeeonX 49 points50 points  (2 children)

Porn is a perfectly rewarding way to learn how to use AI models and gain understanding of prompting.

[–]Wide_Egg_5814 19 points20 points  (0 children)

I mean Google images was not developed to look up images of cats

[–]Figai[S] 4 points5 points  (0 children)

I didn’t say it wasn’t, the sheer amount of amazing models that have come out of beaverAI and theDrummer and more is testament to that. It’s not niche though lol.

[–]emonshr 49 points50 points  (0 children)

Doomsday scenario, bad internet, internet outage, impractical internet cost, safeguarding trade secret etc.

[–]FullstackSenseillama.cpp 22 points23 points  (0 children)

How about building the skills and know-how to run models locally?

APIs are only cheap now because they're heavily subsidized. The moment the free money dries up, expect API costs to skyrocket, similar to how hardware prices have. Thing is, even if you can access hardware at reasonable prices, you'll still need the know-how to build a good machine that can run larger models for a decent price, and to set up the software stack to run those models.

You see it on this sub all the time: people throwing a ton of money at consumer hardware and then hitting wall after wall with compatibility or bottlenecks despite spending a pretty penny. I'm sure in ten years we'll have low-cost turnkey inference solutions, but in the meantime, we'll have to learn how to build balanced systems depending on the hardware we can find.

[–]ThinkExtension2328llama.cpp 35 points36 points  (3 children)

Unlike what scam Altman / Elon cuck says, medical and genetic data can be used to optimise answers locally in a secure and safe environment using MedGemma.

[–]Figai[S] 2 points3 points  (2 children)

Could I ask what your workflow looks like when working with genetic data? I'd never thought of that! Might make that DNA test I did a while back more useful than telling me I might be lactose intolerant.

[–]ThinkExtension2328llama.cpp 4 points5 points  (0 children)

Genetic data, not DNA sequences, i.e. genetic heritage in my case. It means when I'm trying to debug my own body for fitness and health, the information is optimised for me, not the general public.

[–]ThinkExtension2328llama.cpp 0 points1 point  (0 children)

I had a think about this, and a DNA sequence may work but would need preprocessing. Effectively you would take the sequence, capture all the known variants in it, and feed those to the RAG (not the raw sequence). So the RAG would hold, say, that you have a FOO BAR gene and what that means for you. I'd almost be tempted to try and make it an MCP.
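A minimal sketch of that preprocessing step in Python, assuming a 23andMe-style raw export (tab-separated rsid / chromosome / position / genotype) and an illustrative annotation table; the variant descriptions below are placeholders for demonstration, not medical advice:

```python
# Sketch: turn a raw genotype export into short text notes for a RAG
# store, instead of feeding the raw sequence to the model directly.
KNOWN_VARIANTS = {  # illustrative annotation table, not medical advice
    "rs4988235": "Associated with lactase persistence (lactose tolerance).",
    "rs1801133": "MTHFR C677T; may affect folate metabolism.",
}

def annotate_genotypes(raw_lines, known=KNOWN_VARIANTS):
    """Keep only variants we have annotations for; emit RAG-ready snippets."""
    docs = []
    for line in raw_lines:
        if line.startswith("#") or not line.strip():
            continue  # skip header/comment and blank lines
        rsid, _chrom, _pos, genotype = line.split("\t")
        if rsid in known:
            docs.append(f"{rsid} ({genotype.strip()}): {known[rsid]}")
    return docs

sample = [
    "# rsid\tchromosome\tposition\tgenotype",
    "rs4988235\t2\t136608646\tAA",
    "rs9999999\t1\t12345\tCT",  # unknown variant, dropped
]
print(annotate_genotypes(sample))
```

The resulting snippets are what you'd index for retrieval (or expose via an MCP tool), so the model only ever sees human-readable variant notes rather than a raw sequence.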

[–]basxto 10 points11 points  (3 children)

I can run them with solar power :)

[–]Figai[S] 1 point2 points  (0 children)

Ooh fancy…

[–]Waarheid 0 points1 point  (1 child)

Would love to hear more about this :-)

[–]danieldhdds 0 points1 point  (0 children)

you deploy your off-grid solar system, then plug your PC into it

[–]charles25565 17 points18 points  (0 children)

To just mess around :)

[–]z_3454_pfk 8 points9 points  (0 children)

for summarization, basic VLM capabilities, ocr, etc local is easily much cheaper to run. esp at scale.

[–]ConstantinGB 8 points9 points  (0 children)

Actually understanding the technology that is - for better or worse - one of the major shifts for the short and long term future of our species. And you only learn to understand it with a hands on approach.

[–]DrNavigat 5 points6 points  (0 children)

If the only reason was privacy, it would already be worth every penny invested.

[–]Your_Friendly_Nerd 10 points11 points  (4 children)

I added a token counter feature to my code completion plugin.

The other day, I used 1,000,000 tokens in a little under an hour, and I didn't even use a single suggestion.

It didn't cost me a dime.

[–]Citadel_Employee 0 points1 point  (3 children)

How does it come out after you factor in electricity and hardware depreciation? I’m trying to figure that out for my own setup.

[–]DarwinOGF -2 points-1 points  (2 children)

Hardware depreciation is a lie. Electricity doesn't count because I would use it anyway. Also, solars.

[–]BankruptingBanks 4 points5 points  (1 child)

You would use the electricity anyway? Do you know how electricity works?

[–]danieldhdds 0 points1 point  (0 children)

he said that even if he doesn't use an LLM, he would use the same amount of electricity on other things, maybe gaming or watching a movie

[–]nntb 2 points3 points  (0 children)

Cost? Flexibility? Ownership? Fitting the theme of the subreddit?

[–]HopePupal 2 points3 points  (0 children)

here's a "besides privacy and porn": censorship. i don't want my coding model to sass me because it thinks i'm writing malware. fully managed cloud models are always going to have Some Bullshit.

scenarios where you control the entire software stack and are just paying for someone else to run it less so, but there's a lot of overlap between the skills you need to do that and the skills you need to run local anyway

[–]graymalkcat 1 point2 points  (0 children)

How about avoiding surprise API bills?

[–]woolcoxm 1 point2 points  (2 children)

imagine the porno ai could generate. im very interested <3

an adventure where you meet alien life forms and hook them up to dildo machines. yay !! good use of my time.

definitely dont want that leaking on the internet. privacy.

[–]Figai[S] 2 points3 points  (1 child)

Average localllama enjoyer

[–]woolcoxm 1 point2 points  (0 children)

Gotta know what you want out of ai... I mean life

[–]brickout 1 point2 points  (0 children)

My own edification and planning for unreliable internet

[–]Durgeoble 1 point2 points  (1 child)

Cost: the cost of local use is far, far less than subscriptions.

[–]piggledy 1 point2 points  (0 children)

What do you need to pay to get the performance of a subscription locally?
In other words, how much do you have to spend upfront to run a SOTA open model like GLM-5 at good speeds (and decent precision level)?

[–]Mayion 0 points1 point  (1 child)

porn?

[–]lenjet 4 points5 points  (0 children)

Linux ISOs*

[–]Lixa8 0 points1 point  (0 children)

Privacy is already so nice, you can copy paste your api keys, passwords and whatever into the chat without having to worry about it (respecting least privilege ofc).

[–]jtackman 0 points1 point  (0 children)

Cost (if you don’t buy hardware to do it)

[–]KrazyKirby99999 0 points1 point  (0 children)

Reliability of access

[–]StillVeterinarian578 0 points1 point  (0 children)

Not giving my todo-list to a third party

[–]BobbyBobRoberts 0 points1 point  (0 children)

"Privacy" also applies to a ton of business and client confidentiality that cloud models may not work for. That's not a small thing.

[–]Apprehensive_Sock_71 0 points1 point  (0 children)

Latency- and bandwidth-sensitive tasks can benefit a lot. You can use something like Frigate, which sends snapshots to a multimodal model for image classification, which can then set off a local automation. If the process were totally cloud-based, you'd be talking about several hops back and forth to a data center, which can degrade the user experience.
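As a rough illustration, Frigate can point its generative-AI image descriptions at a locally hosted model (e.g. via Ollama) so snapshots never leave the LAN; the exact keys vary between Frigate versions, so treat this as a sketch and check the current docs:

```yaml
# Sketch of a Frigate config section using a local Ollama server for
# image descriptions; key names may differ across Frigate versions.
genai:
  enabled: true
  provider: ollama
  base_url: http://localhost:11434   # local inference, no cloud round trip
  model: llava                       # any locally available multimodal model
```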

[–]One-Employment3759 0 points1 point  (0 children)

Because eventually, when everyone is hooked and completely dependent on the cloud models, they will ramp up the price 10-100x to recoup their investments and capex.

[–]Karnemelk 0 points1 point  (0 children)

Image editing/generation. Prompt it to modify whatever photo however you want, without sharing it with the whole world as training data. Even when paying for subscriptions I wouldn't upload anything.

[–]DisjointedHuntsville 0 points1 point  (0 children)

Use Gemini on AI Studio -> Use Gemini in the Gemini App.

The big model providers neuter the models served to you on their platforms. You have no way of knowing if they suddenly cut the thinking budget on Gemini3 down to zero in the app or whatever, because THEY DON'T ALLOW YOU TO SEE IT OR MAKE ANY GUARANTEES.

You're guaranteed a consistent quality of intelligence when you run your models locally.

[–]Stunning_Macaron6133 0 points1 point  (0 children)

LoRAs, hello?