- cross-posted to:
- apple_enthusiast@lemmy.world
There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.
I have a MacBook Air M2 with 8GB of RAM and I can even run ollama; never had RAM problems. I don't get all the hate.
Maybe in a browser using external resources. Open some Chrome tabs to feel the pain. Apple is a joke.
here you are
VS Code + Photoshop + Illustrator + Discord + Arc + Chrome + screen recording, and still no lag
So not a single cool app, and yet you own a computer.
Which model with how many parameters do you use in ollama? With 8GB you should only be able to use the smallest models, which is far from ideal.
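To put rough numbers on that (a back-of-the-envelope sketch with assumed quantization levels, not actual ollama measurements): weight memory scales with parameter count times bytes per parameter, so on an 8GB machine shared with macOS and your apps, only the smallest quantized models fit comfortably.

```python
# Rough estimate of LLM weight memory vs. parameter count and quantization.
# Ballpark figures only; real ollama models also need KV cache and runtime overhead.

def model_ram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB (weights only, no KV cache or overhead)."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, params, bpp in [
    ("3B @ 4-bit", 3, 0.5),
    ("7B @ 4-bit", 7, 0.5),
    ("7B @ 8-bit", 7, 1.0),
    ("13B @ 4-bit", 13, 0.5),
]:
    print(f"{name}: ~{model_ram_gb(params, bpp):.1f} GB for weights alone")
```

Under these assumptions a 4-bit 7B model already wants roughly 3.3GB just for weights, and a 13B model around 6GB, before you count the KV cache, macOS itself, and everything else running alongside it.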