
UC Berkeley student studying applied math and cybersecurity. I like to build stuff, and I talk about it here. See my resume and my work.

Thunderbolt Networking Is Underused


I've been trying to keep up with local AI hosting, and I've noticed a bunch of Mac Studio cluster demos going around where people daisy-chain Mac Studios together with Thunderbolt 5 cables for AI work, pooling the unified memory. While watching, I couldn't help but be reminded of the two Thunderbolt 4 ports on my laptops, so I decided to try my own version. Surprisingly, it's been so good I've been using it every day for the last two weeks.

Why I tried it

1. Two laptops, different strengths

My MacBook Air is my daily driver because it's light, fanless, and has a ridiculously great battery. But it's got 16 gigs of RAM and no dedicated GPU. My other laptop is an i9-13900HX with an RTX 4070 and 64 gigs of DDR5, but it's heavy, the fans are loud, and the battery lasts about an hour if I'm generous. I've been using them together for a while now, just over SSH, which is great but leaves a lot to be desired, especially when the two machines are sitting right next to each other.

2. Nobody seems to use it this way

Thunderbolt networking has been around since 2013 when Apple added Thunderbolt Bridge to macOS. But every time I see it discussed it's either NAS setups for video editors or enterprise clustering. I've never seen anyone talk about just connecting two personal laptops and using it as a daily thing.

3. It costs $20

I grabbed a cable from Micro Center for 20 bucks. I'm sure you could get a shorter one for cheaper, but I wanted at least a meter.

The setup

MacBook Air M5 (2026) - Apple M5, 10-core CPU/GPU, 16GB RAM, Thunderbolt 4

Gaming Laptop (Linux) - i9-13900HX, RTX 4070 Max-Q, 64GB DDR5, Thunderbolt 4 (Maple Ridge)

On the Mac you just plug in the cable and Thunderbolt Bridge shows up under System Settings → Network. I set a static IP of 10.0.0.1/24.

On Linux, the kernel creates a thunderbolt0 interface automatically:

```bash
sudo ip addr add 10.0.0.2/24 dev thunderbolt0
sudo ip link set thunderbolt0 up
```

Made it persistent with systemd-networkd, and after bracing for a multi-hour debugging session like the Nvidia driver thing from my Omarchy post, I was pleasantly surprised that it (mostly) just worked.
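For anyone replicating this, the unit file that makes it persistent looks roughly like the below; the filename is my own choice, and the interface name matches what the kernel created:

```ini
# /etc/systemd/network/10-thunderbolt.network (filename is arbitrary)
[Match]
Name=thunderbolt0

[Network]
Address=10.0.0.2/24
```

Then `sudo systemctl enable --now systemd-networkd` if the daemon isn't already running.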

Speed tests

I was even more surprised after running the tests. I love networking, and I never thought I'd have such a high-speed connection this easily.

iperf3

Raw TCP throughput, single stream, 10 seconds:

Linux to Mac: 15.9 Gbps

Mac to Linux: 18.8 Gbps

15.9 and 18.8 Gigabits per second, with no retransmissions on the upload side.

The Mac pushes data faster than it pulls, which I assume comes down to how Apple's bridge driver handles buffering. My WiFi tops out around 150 Mbps on a good day, so this is about 100x faster.

I also tried 4 parallel streams and got the exact same 15.9 Gbps, which means a single stream already maxes the link out. The bottleneck is the Thunderbolt controller, not TCP.

What personally made me happy to see was the consistency. Obviously a wired connection is more consistent than wireless, but that graph is a flat line for the full 10 seconds.
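For reference, the invocations behind these numbers are simple. Here's a sketch that runs against loopback so the commands can be sanity-checked on a single machine; over the bridge, the client would target 10.0.0.2 (or 10.0.0.1 from the Linux side) instead of 127.0.0.1:

```shell
# Sanity-check sketch: one-shot iperf3 server plus a short client run,
# both on loopback. Over the Thunderbolt bridge, point -c at the other
# machine's static IP instead.
if command -v iperf3 >/dev/null 2>&1; then
  iperf3 -s -1 -p 5301 >/dev/null &  # -1: serve one client, then exit
  sleep 1
  iperf3 -c 127.0.0.1 -p 5301 -t 2   # single stream, 2 seconds
  # add -P 4 for four parallel streams, or --bidir for both directions
  wait
else
  echo "iperf3 not installed; see the comments for the invocations"
fi
```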

Bidirectional

Traffic both ways at once:

```
[TX]  0.00-5.00 sec  1024 MBytes  1.72 Gbits/sec  sender
[RX]  0.00-5.00 sec  10.9 GBytes  18.7 Gbits/sec  sender
```

About 20.4 Gbps total. The download side got priority, which makes sense because Thunderbolt 4's 40 Gbps is shared bandwidth, not 40 per direction.

Jumbo frames

Quick note, because I went down a rabbit hole on this. The Mac's Thunderbolt Bridge defaults to MTU 9000 (jumbo frames), while Linux was at 1500. I bumped Linux to 9000 to match and it made basically no difference: 15.8 vs 15.9 Gbps. It seems the Thunderbolt controller is the bottleneck rather than packet overhead. But I could be wrong; if you get better results with jumbo frames, reach out, because I'd be interested in how you did it.
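For completeness: the one-off change was `sudo ip link set thunderbolt0 mtu 9000`, and with systemd-networkd it can be made persistent with a [Link] section in the interface's .network file:

```ini
# Appended to the thunderbolt0 .network file; MTUBytes is the
# systemd-networkd option for setting the interface MTU.
[Link]
MTUBytes=9000
```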

Latency

This is probably the most useful result for actual daily use.

Thunderbolt:

```
rtt min/avg/max/mdev = 0.142/0.299/0.435/0.065 ms
```

Tailscale (WiFi path):

```
rtt min/avg/max/mdev = 24.294/39.601/68.759/12.074 ms
```

Obviously this is ridiculously good. It's good to the point where I've set up game streaming with Sunshine (the host on the Linux machine) and Moonlight (the client on the Mac), and there is absolutely no noticeable latency.

File transfers

SCP a 1GB file:
- Linux to Mac: 2.23 seconds (460 MB/s)
- Mac to Linux: 2.24 seconds (460 MB/s)

Almost perfectly symmetrical. The 460 MB/s is bottlenecked by SSH encryption (AES-256-GCM), not the link. NFS or something unencrypted would get closer to the ~2 GB/s the iperf3 tests showed.
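The timings above are just `time scp`. If you want to confirm the cipher is the bottleneck, `-v` shows what was negotiated, and trying a lighter AEAD cipher is a cheap experiment; the host alias and filename here are hypothetical:

```shell
# Show the negotiated cipher for the connection (hypothetical host alias)
scp -v bigfile.bin gpu:/tmp/ 2>&1 | grep -i 'cipher'
# Retry with a lighter cipher; on some CPUs this moves the needle
time scp -c aes128-gcm@openssh.com bigfile.bin gpu:/tmp/
```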

rsync 1,000 small files (10KB each):

real 0m0.324s

A thousand files in a third of a second - not too shabby.

SSH latency

SSH command round-trip (including key exchange):

- Thunderbolt: 147ms
- Tailscale: 445ms

The handshake dominates both numbers, so the gap looks smaller than in the raw ping test. But tools like VS Code Remote and SSHFS that make tons of small round trips benefit a lot from the lower latency.

How I've been using it

I added an alias to my SSH config pointing at 10.0.0.2 from my Mac, and 10.0.0.1 from my gaming laptop. I also have a script on the Mac that auto-switches the alias between Tailscale and the static IP when I plug in or unplug the cable.
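My switcher is a custom script, but for what it's worth, ssh_config can approximate the fallback on its own with a `Match ... exec` probe. A sketch for the Mac side, with a hypothetical alias (`gpu`) and Tailscale machine name:

```
# ~/.ssh/config on the Mac. If the Thunderbolt IP answers one ping within
# a second, use it; otherwise fall back to the Tailscale hostname.
Match host gpu exec "ping -c 1 -t 1 10.0.0.2"
    HostName 10.0.0.2

Host gpu
    HostName gaming-laptop   # hypothetical Tailscale machine name
```

Note that macOS ping uses `-t` for the timeout; on Linux the equivalent flag is `-W`.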

The thing I like most is treating the MacBook as a thin client. On the go it's a normal laptop for school work, but when I sit at my desk and plug in the cable, I've got access to all 64 gigs of RAM, the RTX 4070, 24 CPU cores, Docker, all my dev environments.

I've also been mounting my Linux project directories on the Mac via SSHFS when I want to use macOS tools on files that live on Linux. The latency is low enough that IDEs don't complain about it.
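The mount itself is a one-liner. The paths and username below are illustrative, and on macOS this assumes sshfs installed on top of macFUSE:

```shell
mkdir -p ~/mnt/projects
# reconnect + keepalives let the mount survive sleeps and cable wiggles
sshfs me@10.0.0.2:/home/me/projects ~/mnt/projects \
  -o reconnect -o ServerAliveInterval=15
```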

GPU offloading has been fun too. My MacBook doesn't have an NVIDIA GPU, but the Linux machine does, so I SSH in and run ollama or llama.cpp and interact with local LLMs from a terminal on the Mac. Same for ML training. The link is fast enough that I never notice.

Limitations

1. You need a cable. Both laptops have to be physically next to each other. When I leave my desk I lose the connection. Fine for me since the Linux machine is basically a stationary workstation, but worth noting.

2. SCP is limited to 460 MB/s by SSH encryption. The raw link does 2 GB/s but you're not seeing that with encrypted protocols. Haven't set up NFS yet.

3. Not all USB-C cables are Thunderbolt cables. Make sure you get an actual Thunderbolt 4 cable. They're $12-20 now, but don't just grab a random USB-C cable from a drawer, which is what I initially tried.

What's next

I want to try NFS instead of SSHFS to get closer to the raw 2 GB/s, and want to figure out how much the link actually constrains GPU offload workflows.

Thunderbolt networking has been around for over a decade and every Thunderbolt port on every laptop from the last five years supports it. I'm not sure why more people don't use it. Maybe it's just because nobody talks about it outside of server use. Apple calls it "Thunderbolt Bridge", and Intel tried to turn it into a premium product with Thunderbolt Share, but at the end of the day it's just IP networking over a cable, and it's really fast.

Thanks for reading - feel free to reach out if you have questions or if you've done something similar.


Reviving A 13-Year Old Macbook


A CD Reader. A physical battery indicator. Firewire???



Although it seems like these features came down with the Berlin Wall, believe it or not, a laptop with all those features is what I'm using to type out this very post.



This weekend I undertook a journey to revive an old MacBook Pro. This specific model is a 15-inch, Mid 2012, MacBookPro9,1. It took a lot of frustration, hitting myself in the head, and Halloween candy, but it's finally up and running - and if I do say so myself, it is running beautifully.







Why even revive it?

1. Sentimentality
I've wanted to fix up this laptop for a while. The company my mom used to work for went bankrupt several years ago, and they could only pay out her salary in tech from the office. My very first experiences with laptops were through the MacBooks she brought home, so each one I can repair and keep using means preserving the tech that ultimately gave me a path in life. This is the second one I've refurbished - the first, a 2015 model, I fixed in December of last year (2024) by replacing its extremely swollen battery, and I currently use it as a school laptop. Its specs are good enough to get good battery life and handle any school work (and even a lot of dev work), so I want to give the same treatment to this 2012 laptop - and hopefully find a good use for it.
2. It's got surprisingly good specs
I almost couldn't believe my eyes when I saw that this laptop had... an Nvidia chip? I knew that old Macs had Nvidia chips before there was some sort of falling out, but seeing it with my own eyes, after never having seen a dedicated GPU in a Mac, was really surprising. I'll go over the GPU's performance later (and the pain of getting drivers working), so let me quickly cover the less shocking specs - an Intel Core i7-3820QM @ 2.70GHz (quad-core, 8 threads), integrated Intel HD Graphics 4000, and 8 gigs of DDR3-1600 RAM.







3. Good ol' hype
It's a day that ends with "day", which means there's some new tech in the dev world generating hype. The past few weeks, that piece of tech has been Omarchy, an opinionated Arch Linux setup created by Ruby on Rails creator David Heinemeier Hansson, known most commonly online by his initials - dhh. I've wanted to try Omarchy for a while, and this gives me the perfect excuse: try running any modern software on a Mac this old and you'll quickly run into one of Apple's most notorious business practices - planned obsolescence, the practice of designing a product to fail or lose support so the consumer buys a new one. Not only is this horrible for the environment, it's really expensive for consumers, so bypassing Apple's software locks on older hardware with a new OS is something I definitely want to take advantage of.

The revival

Disclaimer - most of this revival process is actually software based. The hardware on this laptop is honestly fine - I think I might change the thermal paste, but other than that nothing has degraded - even the battery is fine.



It took me a little over 12 hours to get Omarchy up and fully functional on the MacBook, which for a Linux distro isn't horrible, but for Omarchy is unusually long.



Dreaded Drivers



Omarchy advertises itself as an opinionated version of Arch, one that works out of the box instead of making you go through the infamously long Arch installation process. And actually, the Omarchy installer pretty much worked fine, until I got to the most dreaded part of setting up a Linux PC - Nvidia drivers.



What ended up happening was a process that involved reinstalling Omarchy three times. At first, I booted it up and was met with a pitch-black screen. After a lot of confusion, I eventually realized that something was wrong with the video drivers. I used nomodeset to bypass them and enter a low-performance version of the OS. After changing some boot options to use the Nvidia drivers, I rebooted and... nothing again. Except this time, nothing I changed would let me load back into the OS. In changing the drivers I had essentially locked myself out, and had to reinstall.



This time I made sure everything was right. Turns out I had been using drivers that were too new. Omarchy does try to ship good Nvidia support, but my card (an NVIDIA GeForce GT 650M) is so old that those drivers don't support it. I uninstalled the new drivers and installed the old ones from the AUR (Arch User Repository), rebooted and... black screen. I had locked myself out again.



So I reinstalled, and this time I did a bit more research and used nouveau, the open-source driver set that recently got improvements for the Kepler generation of chips, which the GT 650M belongs to. After cleaning up the Nvidia modules, regenerating the initramfs (initial RAM filesystem) and the UKI (unified kernel image), I rebooted and man - it worked.
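For anyone hitting the same wall, the rough shape of the final fix looked like this. Package names are from memory and your MODULES line may differ, so treat it as a sketch rather than a recipe:

```shell
# Remove the proprietary driver that was too new for a Kepler-era card
sudo pacman -Rns nvidia nvidia-utils
# Drop any nvidia_* entries from the MODULES= line in /etc/mkinitcpio.conf
# (the nouveau module ships with the kernel, so there's nothing extra to
# install), then regenerate the initramfs and the unified kernel image:
sudo mkinitcpio -P
```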



First Impressions of Omarchy

After seeing the Nvidia card being used, how snappy Omarchy made the laptop blew my mind. I should note that this isn't exclusive to Omarchy, or even Arch. Loading Linux onto any old tech is a great way to banish the lie that companies tell you - that you constantly need better and better hardware as time passes. No, rather, you need better and better hardware to run THEIR software. 90% of the tasks you do every day can be done on tech from 10 years ago with absolutely no issues - as mentioned earlier, I daily drive a 10-year-old MacBook Pro for all my school work and some of my dev work, and I have no issues.



Anyways, moving on from that rant... yeah, Omarchy lives up to the hype. Having Hyprland preinstalled is great and intuitive, as I've used tiling WMs in the past. It's got a bunch of preinstalled software and a great software installation experience. In fact, I'm writing this blog post in Typora, a preinstalled Markdown editor.



The theme system is also great. I'm just using the default theme right now, but in the future I plan to make my own with custom pixel art screensavers and wallpapers. Should be fun, stay tuned for that!



Performance

At idle, Omarchy uses about 1.2 GB of my 8 GB of RAM. CPU and GPU usage are almost at zero, which is fairly standard for an Arch Linux system.

Gaming

I quickly installed Minecraft just to get a good idea of the CPU's performance. It booted up surprisingly fast, but performance varied. I hadn't played 1.21 yet, so I'm assuming the Vibrant Visuals update was causing the swings due to the... vibrant visuals. I was getting about 50 fps looking at the ground or sky, but only about 15-25 looking straight ahead in a forest.



The performance doesn't sound too great, but I'm actually quite happy with it. With some optimization mods I should be able to get well over 60 fps, and beyond that, I'm probably not going to use this laptop to play games like Minecraft much - I'll probably play even less intensive, maybe retro, games. Additionally, the games I DO play on this laptop will probably use the GPU more, which goes mostly unused in Minecraft.

Multitasking

I was surprised to see how good multitasking was - this laptop only has 8 GB of RAM, but in my use I haven't hit the limit. Although, truth be told, I'm treating this like a laptop with 8 GB, so I'm not going crazy and spamming browser tabs just yet. Here's a btop view of my resources - this is with Typora, 11 loaded tabs in Helium (Chromium-based), and Minecraft.



Minecraft is definitely the resource hog here, but it's not like I'm unable to multitask - it works pretty well, and I'm confident I could daily drive this for school work - just maybe not too much dev work.



Overall Impressions



A lot of love and hate has come out about Omarchy, and honestly both sides make mostly true claims. Yes, this is a great distro for reviving old tech, but that's universal across almost every flavor of Linux. Yes, Omarchy works great out of the box, but it is essentially just someone's personal config and preferences on top of Arch Linux.



I think the value of this distro comes in if you don't want to spend too much time on your OS. Ironically, I ended up spending a whole bunch of time getting Nvidia drivers to work, but regardless, I can definitely see the value of this as a quick switch for devs who feel disillusioned with both macOS and Windows.



Yes, the Arch purists say it's essentially a dotfile repo, and while that's a gross exaggeration, they definitely have a point. However, not everyone wants to be the stereotypical Arch user who spends 90% of their time tweaking their OS. I love the Hyprland config that comes with Omarchy, and all the other great desktop software included. The theme system is also great, and I look forward to getting more into the preinstalled gaming features. Overall, I can wholeheartedly recommend Omarchy - if you've never used a tiling WM or aren't great at memorizing keyboard shortcuts, it will definitely have a bit of a learning curve, but I promise it will be worth it. It's hard to describe, but it really does have a "snappiness" to it.



What's Next?

A good question to ask here is "What's the point?" Honestly, I'm not too sure yet, but I had a great time reviving this laptop. One interesting use case might be turning it into a mobile entertainment/retro game system for flights or road trips. I've yet to test the battery, though, and that's what will really decide what it gets used for. I'll post an update once I stress test it a bit more.



Limitations

Having reached its teenage years, this laptop has definitely been through a lot. Here's a list of everything that currently either doesn't work or bugs me.

1. Glass is cracked (display is mostly fine)
2. Two columns of pixels flash on and off, green and red respectively. Very thin, but undoubtedly a distraction when watching something. Not a big concern for school work - I barely noticed it while typing this out, though that could be because of Typora's white background.
3. A lot of the keyboard is broken - the U, 7, and M keys are the ones I've identified. They either don't type anything or, after enough presses, spit out some string of characters. I'm not exactly sure what's causing this, but the keyboard will probably have to be replaced if I'm to use the laptop on the go. To type this out, and to set up the laptop, I've had to use an external keyboard.
4. Keycaps are missing - the I key and the up arrow. I don't care too much about the appearance, but replacing them would definitely improve the typing experience and TUI navigation - hitting that up arrow is so annoying without the cap!

What you should notice is that all those limitations are hardware related. The software and specs are all things I'm quite happy with - I mean, the camera and sound worked out of the box. Amazing job by the Omarchy team - that's a rare feat on Linux, especially on old hardware!







Conclusions

Overall, this is a process I can write about with no regrets. I learned a lot about Linux, a lot about Nvidia drivers, and a lot about MacBooks. More than that, though, I got to save a laptop from turning into e-waste, which is something I hope everyone can do. So many devices are needlessly thrown out, and plug-and-play Linux distros like Omarchy are a great way to get people to recycle and gain computing power at the same time.



I'm very happy with the results, even if it took a lot of friction to get there. I'll keep testing the laptop and posting updates - thanks for reading! Feel free to reach out if you have any questions or similar issues.
