• 0 Posts
  • 6 Comments
Joined 3M ago
Cake day: Feb 05, 2025


Yes, that’s correct. You download an MSI installer, install Office, and you have fully functioning Microsoft Office on your desktop, no internet connection required. They say they’ve removed the ability to save locally, but I can’t see how that’s technically possible, so I highly doubt there’s no fallback.

It’s not a cloud solution.


I’m not sure if people are being purposefully ignorant, but this shit is crazy…

There’s a serious difference between an entire application running in a cloud environment (Office 365) and an entire application on your local PC (Office) that merely can’t save locally… That’s not a server/client setup.

For the life of me, I cannot understand why everyone here seems to be as disingenuous as possible. It’s honestly fucked up.


You’re using cloud-based storage. That’s not the same as having the entire application in the cloud…

This is a completely disingenuous argument.



This is the point everyone downvoting me seems to be missing. OP wanted something comparable to the responsiveness of chat.chatgpt.com… which is simply not possible without insane hardware. Sure, if you don’t care about token generation speed you can install an LLM on incredibly underpowered hardware and it technically works, but that’s not at all what OP was asking for. They wanted a comparable experience, and that requires a lot of money.
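If it helps, here’s the back-of-envelope I’m working from (a sketch with illustrative hardware numbers, not benchmarks): single-stream generation on a dense model is memory-bound, so tokens per second is roughly memory bandwidth divided by the model’s size in bytes.

```python
# Back-of-envelope decode speed for a dense LLM whose inference is
# memory-bandwidth-bound: every weight is read once per generated token,
# so tokens/sec ~= memory bandwidth / model size in bytes.
# All hardware numbers below are rough, illustrative assumptions.

def rough_tokens_per_sec(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """Optimistic upper bound for single-stream generation."""
    return bandwidth_gb_s / model_size_gb

# A 70B-parameter model at 4-bit quantization is ~35 GB of weights.
model_gb = 70e9 * 0.5 / 1e9

for name, bw in [
    ("dual-channel DDR5, ~80 GB/s", 80),
    ("RTX 4090 GDDR6X, ~1000 GB/s", 1000),
    ("datacenter HBM, ~3000 GB/s", 3000),
]:
    print(f"{name}: ~{rough_tokens_per_sec(model_gb, bw):.0f} tokens/sec")
```

That’s why the same 35 GB model crawls at a couple of tokens per second out of CPU RAM but feels responsive on high-bandwidth GPUs.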


What kind of hardware do you need to run an LLM with comparable responsiveness to ChatGPT?

Generally you need $8,000–$10,000 worth of equipment to get comparable responsiveness from a self-hosted LLM.
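For anyone sanity-checking that figure, the sizing arithmetic looks roughly like this (a sketch; the model sizes, the ~20% overhead, and the 24 GB-per-card figure are illustrative assumptions):

```python
import math

# Rough VRAM sizing: how many 24 GB consumer GPUs a dense model needs.
# Assumption: weights dominate; add ~20% headroom for KV cache,
# activations, and runtime overhead.

def gpus_needed(params_billions: float, bytes_per_param: float,
                vram_per_gpu_gb: float = 24.0) -> int:
    weights_gb = params_billions * bytes_per_param  # 1B params @ 1 B/param = 1 GB
    total_gb = weights_gb * 1.2                     # ~20% headroom
    return math.ceil(total_gb / vram_per_gpu_gb)

for params, label in [(70, "70B"), (180, "180B"), (405, "405B")]:
    for bpp, prec in [(2.0, "FP16"), (0.5, "4-bit")]:
        print(f"{label} @ {prec}: ~{gpus_needed(params, bpp)} x 24 GB GPUs")
```

Even heavily quantized, the larger open models want several high-VRAM cards (or one workstation/datacenter GPU), and that’s where the four-to-five-figure budget comes from.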


Anyone downvoting clearly doesn’t understand the hardware requirements of running a model big enough to rival ChatGPT. ChatGPT runs on a multi-billion-dollar AI cluster…

OP specifically asked what kind of hardware you need to run a similar AI model with the same relative responsiveness, and GPT-4 reportedly has 1.8 trillion parameters… Why would you lie and pretend you can run a model like that on a fucking Raspberry Pi? You’re living in a dream world… Even the bigger offline models need 128 GB of RAM, which is $900–$1,200 in RAM alone…
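The memory arithmetic behind those numbers, for anyone who wants to check it (a sketch; the 1.8T figure is the widely reported estimate, not a confirmed spec):

```python
# Weight memory = parameter count * bytes per parameter.

def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    return params * bytes_per_param / 1e9

print(weight_memory_gb(1.8e12, 2.0))  # GPT-4-scale @ FP16 (2 B/param): ~3600 GB
print(weight_memory_gb(1.8e12, 0.5))  # GPT-4-scale @ 4-bit:            ~900 GB
print(weight_memory_gb(70e9, 2.0))    # 70B open model @ FP16:          ~140 GB
print(weight_memory_gb(70e9, 0.5))    # 70B open model @ 4-bit:          ~35 GB
```

So 128 GB of system RAM only gets you into the territory of today’s ~70B open models, and a genuinely GPT-4-scale model wouldn’t fit on consumer hardware at all.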