Apple Intel Macs and FireWire

If you bought a MacBook Pro in 2007 or so, it came with FireWire 800 and a number of other connectors, and later models added Thunderbolt. A few years after that, Apple did away with every connector except USB-C, for charging and devices alike. The result was a thin laptop that needed dongles and breakout boxes for everything.

If you have a Garmin device to charge, you need an adaptor. If you have a FireWire 400 or 800 drive, you may need two or three dongles to get the right connections. If you have a Lightning cable, you need a dongle to charge one Apple device with another.

## Apple Abandons Products Too

Google gets a bad reputation for killing Google Wave, Google Reader, Google+ and other products rather than letting them live on despite their smaller audiences. Apple does the same thing, except with hardware that you are, in theory, stuck with.

I got the idea for this blog post after reading the 9to5Mac article [“Got an Intel Mac? Here’s the deal so far, and possible future scenarios”](https://9to5mac.com/2024/06/19/got-an-intel-mac-future/). If you bought a 2018–2019 Intel MacBook Air, you’re unlucky: Apple will not support it for as long as you expected, and it will be obsolete sooner than planned, at least when running macOS.

> If you have the budget to make the switch this year, it’s a no-brainer to do it.

I disagree with this sentence. I don’t think it’s a no-brainer, for two reasons. The first is that Macs are expensive, and limited: you need a dongle to do almost anything with the current MacBook Air and Pro. Why would you want this?

## Experimenting with Linux is Free

I believe that the most interesting option is to experiment with Linux on these “end of Apple life” machines, for one simple reason: it’s free. I’ve been using Macs since 2003 or 2004, and in that time we’ve seen PowerPC (G3 through G5), Intel and now M-series chips. Apple not only jumps from connector to connector to find the flavour of the moment; it jumps from chip architecture to chip architecture, and if you’re unlucky you get caught at the end of one chip type just as it switches to the new one, so that expensive upgrade has to happen sooner than desired. Paradoxically, this is an argument for buying the cheaper mid-range machine rather than the more expensive top-spec one: a mid-range device leaves you the budget to slide along to a newer device more often.

>… the main reason to upgrade to macOS 15 is going to be the Apple Intelligence features – and those need the power of Apple Silicon. …If you have the budget to make the switch this year, it’s a no-brainer to do it.

I disagree with this assessment. Although a huge song and dance is being made about large language models and AI, there is the opposite side of the playground: small language models that will run on a Raspberry Pi, or on lower-spec, older Macs. If you want to play with AI you don’t need a top-spec machine. You need a language model that is suited to your needs and to the processing power of your current machine.

## Experimenting with Small Language Models Locally

Apple tells us that to play with its AI tools we need either an M-series Mac or an A17-powered iPhone 15 Pro or Pro Max, but that is not the whole story. We can experiment with small language models on the hardware we have now. They won’t do as much, but we don’t need them to do everything; we need to find the model that does what we need it to do. It’s [discussed by Morten on LinkedIn Learning](https://www.linkedin.com/learning-login/share?forceAccount=false&redirect=https%3A%2F%2Fwww.linkedin.com%2Flearning%2Fusing-lightweight-ai-with-small-language-models%3Ftrk%3Dshare_ent_url%26shareId%3DWs0JRLKRQH%252BwIGaNKR0Vfg%253D%253D).

If you look at [Models on Huggingface](https://huggingface.co/models) you can find models that are more specialised, doing what you want to try to do with fewer resources. Apple Intelligence is easier to use, but it’s also more expensive to adopt: you need to upgrade your phone to the 15 Pro or Pro Max and buy an M-series laptop. By contrast, you can [run Phi-3](https://pimylifeup.com/raspberry-pi-ollama/) with relative ease.
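To give a sense of how little ceremony is involved, here is a sketch of getting Phi-3 running with Ollama on a Raspberry Pi or an old Intel Mac running Linux, along the lines of the guide linked above. The example prompt is mine; model download sizes and speeds will vary with your hardware.

```shell
# Install Ollama (Linux / Raspberry Pi OS); macOS users can install the app instead.
curl -fsSL https://ollama.com/install.sh | sh

# Download the Phi-3 mini model (a few GB) and ask it a one-off question.
ollama pull phi3
ollama run phi3 "Summarise, in two sentences, why small language models suit older hardware."
```

After the first `pull`, the model runs entirely on your own machine, offline, which is the whole point of the exercise.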

## More Accessible Than We Think

The key point, made by Morten and possibly others, is that we don’t need to beat GPT-4. We can use smaller, specialist models that run on much lighter systems. In doing so we reduce costs, both environmental and financial, and we also lower the barrier to entry for playing with machine learning technology.
