Shared posts

31 Oct 02:13

The iPhone 14 Is Much Easier To Repair

by Rui Carmo

This is very, very good. In short, the internals were completely redesigned to make it easy to repair from both the front and the back, which puts an end to almost a decade of nearly inscrutable hardware assemblies.

I love it. It completely flips the usual narrative of right to repair on its head, and puts the ball squarely in its competitors’ court.

And I am going to love it even more next year when it comes with a USB-C port.


26 Sep 04:10

Fastly Compute@Edge JS Runtime

Fastly's JavaScript runtime, designed to run at the edge of their CDN, uses the Mozilla SpiderMonkey JavaScript engine compiled to WebAssembly.

Via phickey on Hacker News

26 Sep 04:10

No mates

by russell davies

I enjoyed this discussion on Woman's Hour about male friendships. Based on Billy No-Mates.

"Men treat women like their HR department"

"Men don't have friends. They have wives who have friends who have husbands"

"The best way to have a friend is to be a friend. Men don't put the effort in"

"Show up when you're asked. Go first when you're not. Keep going even when it's hard"

 

 

 

26 Sep 04:10

Amsterdam's Roetz develops a futureproof, modular 'e-bike for life'

by Liesbeth den Toom

26 Sep 04:08

Introducing LiteFS

LiteFS is the new SQLite replication solution from Fly, now ready for beta testing. It's from the same author as Litestream but has a very different architecture; LiteFS works by implementing a custom FUSE filesystem which spies on SQLite transactions being written to the journal file and forwards them on to other nodes in the cluster, providing full read-replication. The signature Litestream feature of streaming a backup to S3 should be coming within the next few months.

Via Hacker News

26 Sep 04:07

20 Years of Rogue Amoeba

In 2022, Rogue Amoeba is going stronger than ever. Every day, our audio tools help countless Mac users create podcasts, enhance video calls, and so much more. […] But as the date of our twentieth anniversary approaches, it’s nice to take a few minutes to review the past and reflect.

Paul Kafasis and Quentin Carnicelli, two (out of three) founders of RA, were among the first indie Mac developers I met when I was starting out. I remember Paul saying on numerous occasions that the goal was not to create a single great app, but to create a great and sustainable company. And from that, great apps would emerge.

It's indisputable they reached that goal. Congrats to the whole crew over at Rogue Amoeba on 20 years.

26 Sep 04:05

Do You Need a Fancy Bluetooth Speakerphone to Make Your Zoom Meetings Better?

by Brent Butterworth

As if freeing up printer jams and troubleshooting Wi-Fi problems weren’t enough, audio engineering has now joined the list of frustrating technical tasks that office workers are expected to master. With many meetings now combining employees in a conference room with colleagues on Zoom, office workers who can barely figure out their home stereo often struggle to get everyone coming in loud and clear. The causes are manifold: cheap webcam microphones, noisy conference rooms with terrible acoustics, and most annoying of all, TED Talk wannabes who wander the room when they speak. What should be a productive conversation can turn into a frustrating exercise in lip reading, clairvoyance, and disambiguation. Did you just agree to proofread a contract for the boss—or to take care of their basset hound for a week?

26 Sep 04:04

How to set up Python on Compute Engine

by kchodorow

This is a followup to my previous post on setting up big files on GCP. I ran into similar problems with Python as I did with static files, but my solution was a bit different.

The right way™ of running Python on GCP seems to be via a docker container. However, adding a virtual environment to a docker container is painful: with anything more than a small number of dependencies, the docker image becomes too unwieldy to upload. Thus, I decided to keep my virtual environment on a separate disk in GCP and mount it as a volume on container startup. This keeps the Python image svelte and the virtual environment static, both good things! It does mean that they can get out of sync: technically I should probably be setting up some sort of continuous deployment. However, I don’t want to spend the rest of my life setting up ops stuff, so let’s go with this for now.

To create a separate disk, follow the instructions in the last post for creating and attaching a disk to your GCP instance. Make sure you mark the disk read/write, since we’re going to install a bunch of packages.
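For reference, the create-and-attach step looks roughly like this (the disk name, size, and zone here are placeholders; see the previous post for the full walkthrough):

# Create a persistent disk (name, size, and zone are placeholders)
gcloud compute disks create vqgan-models --size=10GB --zone=us-central1-a
# Attach it to the instance in read/write mode
gcloud compute instances attach-disk vqgan-clip --disk=vqgan-models --mode=rw --zone=us-central1-a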

Start up the instance and mount your disk (I’m calling mine vqgan_models, because sharing is caring).
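If it isn’t already mounted, the mount step is roughly this (the /dev/sdb device name is an assumption; check lsblk on the instance to find the right device):

# Create a mount point and mount the attached disk read/write
sudo mkdir -p /mnt/disks/vqgan_models
sudo mount -o discard,defaults /dev/sdb /mnt/disks/vqgan_models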

On your development environment, scp your requirements.txt file over to GCP:

gcloud compute scp requirements.txt vqgan-clip:/mnt/disks/vqgan_models/python_env/requirements.txt

Here’s where things get a little tricky, so here’s a high-level view of what we’re doing:

  1. Create a “scratch” Docker instance.
  2. Add our persistent disk to the container in such a way that it mimics what our prod app will look like.
  3. Install Python dependencies.

Virtual environments are not relocatable, so we need to make the virtual environment directory match what prod will look like. For instance, I’ll be running my python app in /app with a virtual environment /app/.venv. Thus, I am going to mount my persistent disk to /app in the scratch docker container:

# Mount the persistent disk's python_env directory at /app, mirroring the prod layout
docker run -v /mnt/disks/vqgan_models/python_env:/app -it python:3.10-slim bash

This will put you in a bash shell in a python environment container. Everything you create in /app will be saved to the persistent disk.

Note: when you want to leave, exit by hitting Ctrl-D! Typing “exit” seemed to cause changes in the volume not to actually be written to the persistent disk.

Now you can create a virtual environment that will match your production environment:

# Shell starts in /
$ cd /app
$ python3 -m venv .venv
$ . .venv/bin/activate
$ pip install -r requirements.txt

Hit Ctrl-D to exit the scratch docker instance. Shut down your instance so you can change your docker volumes. Go to Container -> Change -> Volume mounts and set the Mount path to /app/.venv and the Host path to /mnt/disks/vqgan_models/python_env/.venv.
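If you’d rather script this step, I believe the gcloud equivalent is roughly the following (a sketch; double-check the flag syntax against the current gcloud docs before relying on it):

gcloud compute instances update-container vqgan-clip \
    --container-mount-host-path=mount-path=/app/.venv,host-path=/mnt/disks/vqgan_models/python_env/.venv,mode=rw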

On your development machine, set up a Dockerfile that copies your source code and then activates your virtual environment before starting your service:

FROM python:3.10-slim
WORKDIR /app
# Only the source code gets copied in; the virtual environment at /app/.venv
# comes from the persistent disk mounted at container startup.
COPY mypkg ./mypkg
CMD . .venv/bin/activate && python -m mypkg.my_service

Build and push your image:

$ export BACKEND_IMAGE="${REGION}"-docker.pkg.dev/"${PROJECT_ID}"/"${BACKEND_ARTIFACT}"/my-python-app
$ docker build --platform linux/amd64 --tag "${BACKEND_IMAGE}" .
$ docker push "${BACKEND_IMAGE}"

Now start up your GCP instance and make sure it’s running by checking the docker logs.

$ export CID=$(docker container ls | tail -n 1 | cut -f 1 -d' ')
$ docker logs $CID
...
I0917 01:24:51.654180 139988588971840 my_service.py:119] Ready to serve

Now you can quickly upload new versions of code without hassling with giant Docker containers.
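Concretely, my iteration loop now looks roughly like this (again a sketch; update-container should restart the container with whatever image you point it at):

$ docker build --platform linux/amd64 --tag "${BACKEND_IMAGE}" .
$ docker push "${BACKEND_IMAGE}"
$ gcloud compute instances update-container vqgan-clip --container-image="${BACKEND_IMAGE}"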

Note: I am a newbie at these technologies, so please let me know in the comments if there are better ways of doing this!

26 Sep 04:03

60 Cycles

by Michael Kalus

On your marks! Follow cyclists from 13 countries as they cover 2 400 km of Gaspé countryside in 12 days—a course longer than those of Italy, Belgium or Spain. The long shots of curving landscape and open road are set to a mesmerizing soundtrack in this documentary, and the results are spellbinding. Directed by Jean-Claude Labrecque - 1965

26 Sep 03:54

Questions I have been asked about photography

… and some answers I have given which may or may not help you.

I haven’t been coding much, instead I’ve been enjoying nature photography, and sharing it on twitter. It’s why I haven’t been blogging here as much as I used to.

As a result, I find myself being asked about photography far more than I am about coding, and so much like with coding, I decided to write up my answers into a longer essay.

In some ways, it’s a return to form. I didn’t start this blog to talk about computers, I started this blog to capture the discussions I was having, to avoid repeating myself. It just so happens I talk a lot more about cameras than computers now.

I know this isn’t about programming, but I do hope that some programmers may find this useful.

Anyway:

What’s the best way to learn photography?

Practice.

I’m sorry, but it’s true. You need to abandon the idea of taking perfect photographs and give in to the numbers game.

Now, there’s some photographers who like to spend three hours measuring light before taking a picture, there’s other photographers who like to spend three hours processing negatives before making a print, and there’s always someone who enjoys using a computer and post-processing the fuck out of a picture. That’s fine and good, but it’s not the best way to start out.

Yes, these are all useful skills to have, but at the end of the day, good photography often comes down to luck, as much as it comes down to preparation, experience, and equipment. If you get one good shot out of every ten, then you’ll get a lot more good shots if you take a couple of thousand pictures.

So grab a camera and just take a bunch of shitty photos. If you don’t have fun sucking at it, you’ll never invest the time to be good at it.

What’s the best camera for a beginner?

The one you’re carrying.

Sometimes this means buying a camera you’ll want to take with you. A camera you enjoy using is always going to be better than a camera you leave at home, no matter the specs. This is how I and many others have justified a more expensive piece of equipment, but if it makes you happy, and you can afford it, why not?

On the other hand, mobile phones are incredibly good point and shoot cameras, and will continue to get better. I have a lot of fancy digital and film cameras but I still use my phone for those everyday moments.

A phone camera is a great camera, and anyone who tells you it’s not “real photography” is one of those people who bought a camera to take photos of, rather than photos with. A collectable over a tool. Dorks.

Will a better camera improve my photos?

Theoretically, yes. In practice, not significantly. I’ve seen incredible photos from disposable cameras, and marvels from phone cameras. I’ve also seen some of the most dull pieces of shit out of the most expensive gear.

A good photo has good lighting first and foremost. A good photo has good framing too. A good photo often has some emotion, expression, or action, and sometimes a good photo was taken on a good camera.

There are exceptions. If you want to do astrophotography, you’ll want a camera that works in low light. If you do nature photography, you’ll want a camera with snappy autofocus. If you do portrait work, you may end up picking the lens you want and then settling on a camera that fits.

Even so, you’ll still need to get the hang of lighting, framing, and expression before you need to worry about equipment.

What about video?

Ask a video person.

All I know is that most stills cameras overheat very quickly if you’re shooting video, and that things like managing focus breathing or autofocus tracking speed become very important very quickly.

In other words: Although you can shoot video on a stills camera, they vary greatly in how good they are at it.

I do have one piece of advice for those of you using a fancy camera as a webcam:

If you’re annoyed by your glasses reflecting light in your webcam (or showing everyone on zoom that you’ve switched tabs), see if you can find a circular polarizing filter. They rotate in place and eventually you’ll find the right adjustment to hide reflections.

So… what camera should I buy?

It depends. [Audience sighs]

You buy one that makes you happy, and I can’t begin to guess what that might be.

It is worth noting that price, weight, and size usually factor into that decision, but there are a lot of different cameras out there, and frankly, there is no "magical best camera", film or digital. Some cameras are better at video, some are better at stills. Some cameras are great at low light, some are tiny little marvels, and some are huge gigantic slabs of metal that have broken out in a pox of knobs, dials, and switches.

The other thing worth noting is that with some cameras, you’re buying into a system—so you should factor the cost of additional lenses into your calculation. Even if the body is cheap, it doesn’t really matter if all the lenses you want are out of your price range.

If you’re looking for video, or lightweight, micro four thirds can be a good choice. M43 is especially good for nature photography on a budget. If you’re wanting vast amounts of third party lenses, and great autofocus, Sony might be for you. If you’ve already got a bunch of Canon or Nikon lenses kicking around, then the choice will already be made for you. On the other hand, if you’re wanting a good all-round camera that’s not too big, not too expensive, and gets you excellent JPEGs out of the box, Fujifilm might be just right.

It really does depend.

Ok, ok, what about the lens?

Lenses come in two basic types. Zoom and Prime lenses. A zoom lets you, well, um, zoom in and out, and change the framing without having to move around, and a prime, well, you gotta move your feet if you don’t like what you’re seeing.

Primes are usually better at low light, usually lighter than a zoom, and often better at autofocus and overall image quality. Some people prefer using prime lenses, and some people prefer the convenience of having one or two lenses to cover a whole range of potential shots. For example, nature people tend to like zooms, street and portrait people tend to like primes, but there are no hard and fast rules about which lenses you have to use.

Usually the more expensive the lens, the sharper it is, and the better at low light it is, but sometimes you will just be paying for a brand name. Lenses by the same manufacturer as the camera usually work a lot better than third party ones, but not always. Once again, it depends.

All lenses, zoom or prime, have a focal length (or a range of lengths) which gives you an idea of the angle of view you’ll get. Unfortunately, once again, it depends. The actual angle of view you get from a lens depends on how big the sensor is on a camera.

This is why I’ll be talking about "35mm equivalent" or "full frame equivalent" focal lengths, so if you’re thinking about a micro four thirds camera or APS-C camera (smaller sensor), the numbers might look a little different.

Anyway:

A lens below 20mm is a “super wide” lens, and they’re great for night sky, landscape or architecture shots. A 14mm lens makes a tiny cupboard shaped room feel big and breezy, and they can be great for crowd shots at events too.

A 28mm lens is what your phone already has. It’s great for selfies. If you’re doing video, you probably want something between 20mm and 28mm, or stand quite far back from the camera.

A 28-75mm kit zoom is the sort of lens you get with your camera, use once or twice, and then buy a lens you enjoy using, after working out which focal lengths you tend to shoot at. You won’t regret buying one, and they’re often very useful for event photography.

A 35mm, 40mm, 50mm (full frame equivalent) lens is a solid investment. Most camera makers have a cheap 50mm and you really won’t regret buying one. A so called “normal lens” is good for landscapes, portraits, and casual photography.

A lens between 65mm and 135mm is really good for portraits, but they’re often much bulkier and more expensive than shorter lenses. You can get zoom lenses that cover 70-200mm and they’re usually quite good for event photography.

Any lens longer than 200mm is for nature photography. You probably want a zoom lens like 100-400mm or 200-600mm if you’re serious about taking pictures of birds. The longer the lens, the heavier the lens, and eventually you’ll end up lugging around a tripod just to keep the damn thing stable.

It’s a minor note, but technically these big lenses are called “long lenses”. A telephoto lens is one where the focal length is longer than the actual lens itself. You also can get a variety of special purpose lenses, like fisheyes, soft-focus lenses, or even things like “smooth trans focus” lenses. You probably don’t need to worry about any of this, but it is worth mentioning.

Anyway: If you don’t know, buy the kit lens (a 28-75mm), and a ‘nifty fifty’ 50mm. You’ll work out the rest as you go along. We’ll get to what those f-numbers on lenses mean in a little moment, but if you’re impatient, scroll down, I’m not a document cop.

Do I shoot in Auto? JPEG? Raw?

Auto is fine. JPEG is fine. Raw is often overkill, despite all the posturing men online saying otherwise. You have my permission to ignore any and all advice about photography online, including the advice here.

There is one thing worth mentioning though: White Balance.

Digital cameras don’t always know how warm or cold to make a photograph look. If you’ve ever tried taking a picture at sunset and wondered why everything turned out blue, then white balance is what caused it.

The problem with JPEG and Auto is that the camera doesn’t always get things right, and in some cases (weird lighting), it can get things disastrously wrong, and it can be very hard to edit a JPEG to fix these issues. This is why people who do weddings and events regularly shoot in RAW: they have no second chances to take a photograph, and RAW is the only way to ensure you can fix whatever went wrong.

On the other hand, if you’re not at a wedding, you can just take another shot, tweak the settings, and try again. You can just cheat and avoid white balance issues by shooting in black and white. If you have a JPEG that’s too warm or too cold, changing it to monochrome often leaves you with a better picture.

Still, it’s nice to get it right the first time around.

You can set white balance from presets (Sunny, Cloudy…) as well as setting the direct color temperature in Kelvin. You can also use a test card, or a special lens filter to configure the color temperature from a reference, if you really want to get things right.

For the most part, leaving things on AUTO will work, but it is useful to know how to adjust the white balance when things go wrong.

The other big problem with AUTO is when someone is backlit, but we’ll get on to how to fix that in a moment.

I want to shoot manually.

You probably don’t.

Light meters have been around for almost as long as photography, and professional photographers use light metering to ensure the shot comes out perfectly. Your camera is just saving you some time and effort by putting in the settings the light meter recommends.

Then again, sometimes you do want to shoot manually. Sort-of.

First, we’ll have to get some understanding of what the settings do. ISO, Shutter Speed, and Aperture control how the camera handles light, and what tradeoffs it makes when there isn’t enough light to go around.

ISO is the easiest one. A lower ISO (like 50) gives a darker image, but with less noise. A higher ISO (like 3200) gives a brighter image, but far more noise as a result. If you’re using a film camera, you can pretend I said “grain” instead of noise, but it’s pretty much the same thing.

This is why all punk rock gig photos look like a grainy mess. There’s just not enough light to take a clear picture.

Shutter speed is also pretty easy to understand. If you have a slower shutter speed (like 1/30s), it lets more light into the camera, but you get more motion blur as a result. A faster shutter speed (1/2000) gives a darker, but sharper image. This is why ye-olde cameras needed people to sit still for several minutes, to gather enough light to get a decent photograph.

Aperture is a little more difficult to understand, because the scale is weird and the terminology is confusing. The aperture is the little iris inside the lens that opens up and closes down, to let more or less light inside. It’s measured in f-numbers, which go on a scale of root-2 (2, 2.8, 4, 5.6, 8, 11, 16), but a low f-number means a large opening, and a high f-number means a small opening.

Which is why I tend to talk about an “open” or “wider” aperture (a small f-number, and a big opening) and a “closed” aperture (a big f-number, and a small opening), because I get confused when people say “a larger aperture” and I’m never sure if they mean “a larger opening” or “a larger f-number” (and thus a smaller opening).

Anyway.

An open aperture lets more light in, but gives you a smaller depth of field. Only a small section of the image will be in crisp focus, and the background will be blurry. A closed aperture lets less light in, but you get a much wider depth of field, and if you close it down enough, you can get almost everything in focus.

I find this a little counter intuitive because you let in less light to see more of the image, but physics isn’t my strong suit.

To recap:

  • ISO trades light for noise. A higher ISO is a brighter image, but has more noise.

  • Shutter trades light for motion blur. A slower shutter has a brighter image, but more motion blur.

  • Aperture trades light for background blur. A wider aperture (low f-number) has a brighter picture, but only some things will be in focus.

Sometimes we use fast or slow instead, but it’s a little confusing. A faster ISO lets more light in, and a faster lens lets more light in, but a faster shutter lets less light in. I’m sorry about that.

Any change to one of these must be reflected in the others. You let more light in with the aperture? You need to use a faster shutter or lower ISO to compensate. You use a higher ISO? You need a faster shutter or to close up the aperture. It doesn’t help that each one is on a different scale. ISO goes 50, 100, 200. Shutters go 1/500, 1/250, 1/125. Aperture goes f/8, f/5.6, f/4, f/2.8.

Your camera knows all of this and can help you. This is why you don’t want to shoot manually.

If you want to shoot on automatic, but control the motion blur you can use shutter-priority, and your camera changes the other settings to match. If you want to control the depth of field, you shoot on aperture priority, and once again your camera will pick the right settings. If for some reason you want to control aperture and shutter, but not ISO, you can do that too but it’s often a little more involved.

There’s also a third mode: Programmed Auto. It’s like full auto but when you spin the dial, it lets you pick which shutter and aperture combination you want. Spin it one way and get a thin depth of field, spin it the other and avoid motion blur.

In summary: Auto is great, but Aperture Priority and Shutter Priority are great too.

If you are wanting to learn to shoot manually, it is a lot easier to try using aperture priority first, and seeing how the settings change.

How do I learn to shoot manually?

Ok. Ok.

You know how I said “all the scales are different”. Well I lied. Somewhat. They are all different, but they all measure the same thing, the amount of light the camera receives.

When you double the ISO (say 50 to 100), it can see twice as much light as before. When you double the shutter (say 1/125 to 1/250), you halve the amount of light that gets into the camera. When you, uh, multiply the aperture by the square root of two (i.e. f/2 to f/2.8, or f/2.8 to f/4) you halve the amount of light being let in.

Thinking about doubling and halving the light is so common in photography that we have a special term for it. A stop. For example, one might say “Stopping down a lens” to mean going from f/2 to f/2.8. Going up a stop or down a stop is doubling or halving the amount of light.

Which is useful because you can’t say “double the aperture” in the same way you can say “double the shutter speed”, but you can say “going up a stop” and “down a stop” across any of the settings. So if you stop down the lens by one stop (halving light), you’ll need to open up the shutter by a stop to compensate (doubling light). Or change the ISO.

There’s often an "Exposure Compensation" dial on cameras, or a setting buried in a menu somewhere. It lets you over or underexpose a shot by a few stops, which comes in handy when you have an extra dark scene, or more commonly, you’re taking a photo of someone who is backlit.

Anyway.

How do you learn to shoot manually?

You use a light meter to measure the thing you’re going to photograph. You read off the settings it gives you, put them into your camera, and then adjust them together to ensure that the exposure remains the same.

If it sounds a lot like “Shutter Priority”, “Aperture Priority”, or “Programmed Auto”, well, you’re not wrong. For the most part, on a digital camera, and many analogue cameras, you can lean on the inbuilt light meter to do the work for you.

Unless you’ve decided to be a dork and bought yourself a fully analogue camera.

I can’t blame you, I’m that dork too. You will want to get a light meter, and you may want to learn about the “Sunny 16 rule.” Estimating the level of light in a scene is something you get better at doing from practice, and managing the exposure in a photo is one of those “real photographer skills” boring men on the internet keep banging on about.

It is useful, sure enough. Sometimes Auto doesn’t do the right thing and you need to compensate. Sometimes you want to mess around with the settings by hand to see what’s right, but taking well exposed boring shots the hard way doesn’t make you a better photographer. Hopefully you’ll have fun doing it, at least.

You’ve talked a lot about lighting, but nothing about framing, or expression

That’s because lighting is the most important detail. Framing is often more personal taste. Expression is often more about being in the right place at the right time. Equipment is just something you buy to fill the void in life, telling yourself that it won’t depreciate in value.

Lighting? Lighting is the most important thing. Understanding white balance will give you the tones you want in an image. Understanding exposure and understanding stops means you can trade one quality for another, and take a little more agency over what the photo looks like.

That said: Although lighting means you can take a photograph, what makes a photograph good is highly subjective. I have some blurry ass shit photos that capture a moment, show an emotion, and I love em to pieces.

Photography is a numbers game. Good photos happen because you’re in the right place, and take a whole bunch of photographs. If you’re not happy with your photos, you simply aren’t taking enough of them.

The best way to take a lot of photos is to have fun. If this means using a cheap ass camera, go for it. If it means using some german fancy pants camera that makes funny noises, sure, burn your disposable income. If it means using the phone in your pocket, that’s great too.

It’s about having fun, that’s the secret of taking a good photo.

That said, I do have some framing advice: If you’re taking a photo of something with eyes, try and get on their eye-line to take a photo. In other words, crouch down if you’re taking a picture of a duck. It looks way better.

I want to shoot film.

I congratulate you on having a lot of disposable income.

In theory, film photography is cheap. You can pick up a crappy camera for almost the same price as a roll of film, and get it developed for the same price. This is the lie you tell yourself when you get started.

Film is always more expensive in practice. If you shoot more than 1000 shots on a decent digital camera, it’ll be cheaper than shooting on film. If you shoot one roll of film every month, after two and a bit years, you’ll be spending more per shot than you would have with digital.

Let me be clear. You don’t get into film because it’s “cheaper”, you get into film for one of two reasons.

You’re an unbearable hipster, and you’d rather shoot Portra than use a film simulation preset. Nothing I can say will change how you feel, and you probably already know which Leica and/or Hasselblad you want. Go for it. I won’t stop you. I hope you enjoy posting phone pics of the view through the waist level viewfinder, or posting videos of you loading expired film into the camera.

The other option is that you’re looking for a mechanical stim toy. In which case, congratulations, you are about to make the investment of a lifetime. There are a lot of old cheap cameras out there, and they make some incredible noises.

Now, wait. You might tell me there’s a third reason, or even a fourth. You are welcome to indulge in this cognitive dissonance, talking about "grain" you can simulate, talking about constraints you can impose yourself, but truthfully you’re getting into film to tell people you’re getting into film (the vibes), or you’re getting into film because whirr click box make good noise (the stims).

Maybe I am being a bit mean. Sure enough some people get into it because of nostalgia, or they enjoy the process. Darkroom chemistry is fun, and making something happen without a computer is kinda magical after staring at the bad screen all week. Even so, you stick around for the vibes or the stims.

Anyway, if you’re getting into film, let me give you some free advice.

You want to use 135/35mm film.

Medium format (120, not 120mm), large format (4x5 and higher), and subminiature formats can be harder to obtain, harder to process, and often more expensive than plain old 35mm film.

Black and white film is also a great place to start if you’re unsure of which film to go for. Something like Tri-X or HP5 will cover a wide variety of uses, and is substantially cheaper than color options. Slide films (Velvia, Provia, E100, etc) are much less forgiving than any other type of film. Try them out, sure, but only after you’re comfortable.

The best film to start with might be XP2 Super. It’s black and white, but you can send it to any lab that handles color film (it’s chromogenic, and gets processed in C41 chemicals). XP2 can be shot at any ISO between 50 and 800, and you can under and overexpose shots by a significant margin and get away with it.

It’s great for using in those toy plastic cameras you bought on eBay while drunk.

At-home development doesn’t need a darkroom.

You need a tank, a changing bag, and some chemicals. Black and white film is very forgiving, unless you develop it in a DF96 monobath, in which case any slight variation of time, temperature, or the phase of the moon can give wildly different results. Really.

Don’t be tricked by a monobath. It is not simpler, it is harder to get reproducible results, and it’s more expensive than re-using fixer and stop bath in the long run. I’ve seen more rolls ruined by monobaths than I have by any other method.

Old cameras are janky as fuck, unless you buy from Japan.

The shutter won’t always fire at the right speed, and the light meter will probably be broken or not work entirely. That is unless you buy from a Japanese eBay vendor.

It’s a bit more expensive, but if you want a camera that’s been tested, cleaned, and inspected, Japanese camera vendors are at the top of their game. There are some western stores with similar quality, but they’re few and far between.

An ideal first camera has aperture priority.

This is more of a personal opinion, but I think it’s a lot easier to shoot film when you can lean on the camera to handle exposure at first. Metering by hand can take a lot of the spontaneity out of photography, and it can be a bit clumsy when you’re starting out.

You don’t want a camera with a Selenium (battery free) meter. It won’t work. You want a cadmium sensor (uses a battery) and you want a camera that takes batteries still in production.

Ideally, you’ll want a camera that accepts LR44 batteries. Cameras that take silver cells require a constant voltage, which can lead to fun results when your batteries start to fade. Cameras that require mercury batteries require an expensive adaptor, or just ignoring the light meter and hoping for the best.

Find something made after 1980 and you should be fine.

It’s worth remembering that a camera with cheap lenses is going to be more fun.

You can buy into old Nikon, Canon, or Leica gear, but you don’t have to. Minolta, Konica, Kodak, Olympus, and a number of Russian companies made gear that was just as good and in some cases better. If you do decide to buy an SLR with interchangeable lenses, check around for the lenses first.

It may help you pick and choose what you want. I’m hesitant to recommend any cameras directly, as I don’t want the hipsters to pick up on it. It’s nice that there’s still some affordable starter cameras out there.

Hipster cameras are often overrated. Rangefinders especially.

Rangefinder users are often about the vibes, and who can blame them. Nice chunks of metal, small enough to carry, that make a pleasing but subtle click as the shutter fires. The prices, however, are driven by collectable status rather than practical experience.

Rangefinders suck ass when the light disappears. They’re good for normal lenses (35-50mm), but aren’t as good for very wide or very long lenses. Some rangefinders require you to focus and compose separately. Some rangefinders require you to cut the film in a specific way to load it. Some rangefinders don’t even have a nice winding crank, and the stim just ain’t as good.

On the other hand, a lot of SLRs have their bad points. Often the weight, and always the size, but if you’re looking for a more general purpose film camera, an SLR is the right choice.

I say this as someone who owns several rangefinders, enjoys them, and even nails a shot at f/0.95 now and then: If you’re getting into film, don’t get a rangefinder.

Zone focusing is the ultimate point and shoot experience.

Focusing a lens comes in many forms, and it’d be worth explaining the different kinds.

  • Guessing.

That looks about 3 meters away, so I dial 3m into the lens and pray. Sometimes it works.

  • Ground glass.

You put a bit of glass in the back of the camera, and move the lens until the image appears in focus, and then swap it out for film. Large format and some medium format cameras handle this.

  • Twin lens reflex

You put another lens on the camera, and look through it to focus the other lens. Gears are involved. They’re kinda quirky and cute, but much much bulkier.

  • Rangefinder.

There’s a little device inside the viewfinder that has an overlapping image. As you adjust the focus on the lens, it adjusts the overlapping images. When you see no overlap, that’s what the lens is focusing on. Some people love it, almost religiously.

  • Single lens reflex

There’s one lens, a mirror, and a view finder. The mirror slaps out the way when you hit the shutter. Good times.

  • Zone focusing / Hyperfocal distances.

This is basically guessing, but by stopping down the lens, you get a wider depth of field. You can assume something is either 0.7m-1.5m away, 1.5-3m away, or 3m or over, and mostly get things right.

There is something liberating about just going "fuck it" and hitting the shutter. If you just want a chill time, find a camera with zone focusing. You select "Portrait", "Group" or "Landscape" and hope for the best. A zone focusing camera is a real fun party camera, one that other people can use without thinking.

Hyperfocal distancing is the posh version of zone focusing. Many lenses come with a little scale above the focus dial that shows how wide the depth of field is at a given aperture. It’s pretty much the same deal, but you’re doing zone focusing by hand rather than with a three mode selector.

Rangefinder camera owners regularly use zone focusing in order to capture shots quickly, because as it turns out, even with sufficient german engineering, manual focusing is slow.

Nighttime, indoors, and the tyranny of silver halides.

If you want to shoot at night time, or under artificial light, don’t bother trying to shoot film. I mean, you can. You buy expensive film, learn how to push film, and get a lovely punk rock grainy mess. That, or you end up being the dork blinding everyone with a flashgun.

Analogue film just isn’t great at low light shots. It’s fun, but your bad photos will look terrible, and your good photos won’t look great. I still enjoy it, but I wouldn’t recommend it to someone getting started.

Flash especially is hard to get right.

You point the flash directly at someone and you get the least flattering picture of them you’ll ever see in your life. It’s better to use bounce flash, a diffuser, or a soft box to light someone’s face up.

In a pinch, you can use a cigarette paper over the flash as a cheap-ass diffuser. This works really well with phones too.

Anyway, if you really want to get into flash photography, well, you’ll probably want a camera that supports TTL metering. That gets expensive, quickly. Old cameras and lenses have very slow flash sync speeds, often as bad as 1/30s, which can lead to a blurry mess if you’re shooting handheld.

Flash photography is honestly such a deep subject that it deserves a whole other write up, but I’m not the right person to do that.

Scanning your own negatives is pain and suffering.

Really. It’s bad. Awful. You dust and dust and dust and still there are specks on the film. Black and white film can’t use automatic dust removal tech, but color film can. Now you have to calibrate your monitor and your scanner to ensure accurate colors are coming through.

The suffering never ends with scanning. Pay someone else to do it.

Are you saying I should avoid film?

In the end, no-one cares if you shoot film, except for other film nerds. You should just know you’re getting into an expensive, time consuming hobby.

I’ve no regrets.

26 Sep 03:53

week ending 2022-09-22 BC

by Ducky

Statistics

This week’s BC CDC weekly report said that in the week ending on 17 September there were: +637 cases, +155 hospital admissions, +27 ICU admissions, +22 all-cause deaths.

This week’s report said that the previous week (data through 10 September) there were: +574 cases, +180 hospital admissions, +31 ICU admissions, +26 all-cause deaths.

Last week’s BC CDC weekly report said that in the week ending on 10 September there were: +574 cases, +142 hospital admissions, +31 ICU admissions, +16 all-cause deaths.

Last week’s report said that the previous week (data through 3 September) there were: +617 cases, +180 hospital admissions, +26 ICU admissions, +36 all-cause deaths.

The BC CDC dashboard says that there are 305 in hospital / 22 in ICU as of 22 September 2022.

Charts

From the BC CDC Situation Report:


From the BC CDC Variants of Concern report:


26 Sep 03:53

Strange times

by Chris Grey

What a fortnight to have been on holiday from blogging! That Liz Truss would become Prime Minister was, of course, expected; that the Queen would die two days later was not. One consequence of this conjuncture was to virtually suspend normal politics until this week, thus muffling Truss’s early decisions. But perhaps we will not see a return to normal. It’s possible that the end of the Queen’s long reign will provoke new questioning of our politics. And, with more certainty, there’s a good case that Truss’s government is going to be unlike any we have seen before.

Death of a Queen

The Queen’s death was plainly a major historical event, but it might prove to be more than an event by triggering a process of national reflection. Two years ago, I wrote an article in Byline Times anticipating that her death could have such an effect, initiating or intensifying a conversation about Britain’s modern history and contemporary place in the world. There are already signs of it, interesting examples including Lewis Goodall’s reflections on ‘The Queue’ and  a debate between the twice Orwell Prize  longlisted journalist Nesrine Malik and leading historian Professor David Edgerton, though if it becomes widespread it won’t happen simply by formalised discussion so much as by a gradual seepage into public discourse.

It will entail much more than Brexit but it would encompass Brexit, if only because the Queen’s death ends one of the last direct links to the Second World War, which looms so large in the historical reference points for Brexit. But that is only a symptom of the more general way that, without many people really recognizing it at the time, the 2016 referendum was a kind of unacknowledged conversation about, precisely, Britain’s modern history and contemporary place in the world. It wasn’t explicitly couched in those terms, or if so then rarely, perhaps because both sides expected ‘remain’ to win. Nevertheless, both as an economic proposition about trading relationships and a political proposition about national sovereignty – and about how those things related to the fifty years of EU membership – this is what was at stake.

So, unwittingly, we had a national conversation but botched it by not knowing that’s what it was and not understanding what we were talking about. Largely as a result of that botch, the actual consequences of Brexit are still only gradually being uncovered. Yet, already, the blithe assumptions about the ‘sunny uplands’ that awaited Britain have been brutally exposed as false. Just this week, Truss has publicly acknowledged that one of the key ‘prizes’ (though in fact always of very limited value) of Brexit, a UK-US trade deal, is not even remotely in prospect. Though almost casually made, it is in itself a huge admission of failure, whilst exemplifying the wider failure of all the Brexiters’ promises.

The failure of Brexit

For to everyone apart from the diehards, who are so invested in Brexit that nothing would persuade them otherwise, it is now plain that Brexit has been, and will continue to be, hugely damaging to the UK. It simply isn’t viable as a strategy within a world of regional economic and trade blocs, and of trans-national regulatory systems, and no amount of ‘Global Britain’ blather can make it so. Relatedly, the idea of a post-Brexit geo-political ‘pivot’ to the Indo-Pacific, always a fantasy, has been shredded by the Ukraine War. Whether at the level of political alliances or of energy co-dependence, as well as at the levels of trade and regulation, the UK’s divorce from the EU has been exposed as folly. Hence only 27% of the population now think that leaving the EU has had a positive effect on the country (48% negative, 18% no difference, 8% don’t know), and although, despite that, 38% still think that leaving was the right thing to do (51% wrong, 11% don’t know) this has now been consistently the minority view throughout almost the entire post-referendum period.

In the process of enacting Brexit, the UK – in an almost literal sense, since the Brexiter fantasy involves a denial of geographical reality – has been misplaced in the world, and therefore the UK’s place in the world – in a metaphorical, geo-political sense – has been mislaid. Moreover, the very existence of a United Kingdom of Great Britain and Northern Ireland has been put under new strains. That is in no small part due to having treated the referendum result as binding on those parts of the Union that didn’t vote for it. And that in turn is an aspect of Brexit being pursued in the hardest of forms when, in any form, it only briefly had the narrowest margin of support even treating the whole of the UK as a single entity.

Imagine that Brexit was a success

So, in summary, whilst the Queen’s death, marking a fracture in, effectively, the entire post-war history of the nation, would always have had the possibility of opening profound questions about modern Britain, that possibility has been given a particular salience by Brexit. For her death has coincided with the pursuit of a new national strategy which is manifestly failing and which is clearly and consistently lacking majority support (and, as an interesting aside, it seems to have been inadvertently revealed this week that, contrary to reports at the time, the Queen was not in favour of Brexit).

Indeed, it’s highly revealing to consider just how different things would be if Brexit had been anything approaching the success its advocates promised. In those circumstances, the national mood (if there is really such a thing) would undoubtedly still be reflective about Queen Elizabeth’s death, but at the same time confident and united in being at the start of the exciting new journey of national renewal which Brexit was already beginning to deliver. Perhaps it would even be quietly muttered that, noble as the Queen had been, she had reigned over a nation that had given in to precisely the ‘declinism’ to which Brexit was the solution. Time, indeed, for a change to a new Carolean era of prosperity and optimism.

Instead this huge symbolic rupture with the past has overlapped with a profound disquiet about the present and the future because of Brexit. With this occurring at a time of severe and growing economic crisis, the possibility of serious national self-reflection becomes even more likely, although where it would lead is impossible to predict and not by any means assured to be either benign or unifying.

Enter Truss

The arrival of a new Prime Minister is not an unfortunate addition to this moment of historic instability. It is, as I wrote in my previous post, the latest episode in the political instability wrought by Brexit. It also marks a distinctive moment, in that it is the first time since the referendum that the free-market, deregulatory, libertarian right strand of Brexiters has been unequivocally in control of government, with advisors drawn from its numerous thinktanks.

Of course, that strand was important before and, in conjunction with the purely nationalist and anti-immigration strands, was hugely significant in pushing for Brexit in the first place and, subsequently, for insisting it meant hard Brexit. Even so, neither Theresa May nor even Boris Johnson was ever of their number, and Johnson’s 2019 majority, which Truss inherits, was predicated on the same shape-shifting about the meaning of Brexit that the Vote Leave campaign had used to secure the vote in 2016. It’s easy to forget now, but that Vote Leave team, headed by Dominic Cummings, which was also the power behind the throne for most of Johnson’s premiership, was deeply contemptuous of the ERG and most of the Brexit Ultra MPs who are now in the ascendant.

There is no doubt that they see this (£) as their first and perhaps last chance to get ‘true Brexit’, even though what they mean by that was never put to the electorate, either in 2016 or in 2019. No longer do they think it so crucial, as they once insisted when promoting Brexit, that the people choose their rulers. That means not only that the government has little democratic legitimacy but also that it will face significant constraints on implementing its agenda. The internal contradictions of the diverse voting coalition Johnson and Cummings created have not changed, whilst Truss’s popularity amongst Tory MPs is much more limited than Johnson’s in 2019 (or May’s, in the heady days preceding the 2017 election).

Moreover, whilst some people get understandably exercised by the role of ‘Tufton Street’ thinktanks, it’s worth recalling that Johnson’s government, probably no less than Truss’s, was heavily linked to such bodies. But governments are constrained in ways that thinktanks aren’t, and even the most ardently ideological libertarian SpAD quickly finds that political reality is very different to the world of position papers and whiteboards. For example, as Martin Wolf of the FT points out (£), it is hardly compatible with Hayekian doctrine for the government to be setting a national economic growth target.

Constrained Truss

More generally, the economic circumstances of the energy crisis, inflation, public service and especially NHS crises, flagging investment, and a currency in real danger of imploding – much of which has been made worse by Brexit, and all of which comes like punches on the numerous purple bruises caused by Brexit – are more severe than any British Prime Minister has faced for decades, and will place significant limits on Truss’s freedom of action. The most obvious consequence of this is that, “libertarian revolutionary” as she may be, Truss is about to preside over a huge expansion of government spending and debt to subsidise energy bills because of the unavoidable exigencies of the moment and in direct contradiction to the position she held against ‘handouts’ only a few weeks ago.

To the extent it has any coherence at all, ‘Trussonomics’ will repeatedly run up against these political constraints. Its guiding theme of ‘trickle-down’ economics, the discredited theory which informs tax cuts and the decision to lift the cap on bankers’ bonuses, is completely at odds with the politics of a country facing a profound cost-of-living crisis. Strangely, the chosen leitmotif of a relentless focus on GDP growth (£) ignores one of the populist lessons of the Brexit campaign – ‘that’s your GDP!’ – which informed the now abandoned rhetoric of the ‘levelling up agenda’.

Nor is the plan to review the restriction on working hours inherited from the EU Working Time Directive, for all that it is a longstanding cause celebre for Brexiters, one with obvious populist appeal (note, too, that it is only a ‘review’: like so many deregulatory reviews initiated since Brexit it may well come to nothing once the practical realities emerge). Indeed, as many have remarked, Truss’s early moves could hardly have given so many political hostages to the Labour Party. To that should be added the point that nor could they more clearly have fractured the already fragile coalition of support for Brexit.

Brexity Truss

In other ways, though, Truss’s approach is resolutely ‘Brexity’ and, whatever she may have thought in 2016, she must now be counted amongst the hardest of Brexiters. Apart from anything else, it is no coincidence that her economic guru is Patrick Minford, for decades an extreme and minority economic voice on Brexit and much else besides. It’s almost incredible that this Thatcherite fossil, already a schoolboy when Queen Elizabeth began her long reign, is now heavily influencing government economic policy. One thing which comes with Minford’s contrarian resentments is the familiar Brexiter hostility to the ‘economic Establishment’ and the civil service, of which the immediate decision to fire Sir Tom Scholar, Permanent Secretary to the Treasury, was emblematic.

This attracted much criticism as an unfair and foolish treatment of the civil service, but also reflects a deeper and even more dangerous thread in Brexit, which is to treat expert consensus as an ideological conspiracy against Brexiter truth. This is difficult ground, since it would plainly be wrong to assert that expert consensus is necessarily right, or is the repository of some non-ideological truth, and, for that matter, critique of the ‘Treasury view’ as a small-c conservative constraint on government ambitions isn’t confined to Brexiters. But whilst it is always possible that the consensus view in economics may be wrong, of all the forecasts of Brexit’s economic effects by far the most inaccurate was that of Minford’s Economists for Brexit group.

However, predictive track record is not what is at issue here. Ever since 2016, the repeated attacks on not just the Treasury, but the Bank of England (and especially its former Governor Mark Carney), or any individual or institution that even questions Brexiters’ claims, go well beyond the parameters of normal political and economic debate into something resembling religious sectarianism. Again, though, Truss doesn’t operate without any constraints, even in these very early days, and the backlash against Scholar’s sacking seems already to have forced her to hold back from doing the same to Sir Simon Case, the Cabinet Secretary.

Cakeist Truss

Truss also displays some of the ‘cakeism’ which did so much to transform Brexit from a political prospectus with which one might agree or disagree to a wholly dishonest project in which reality is denied in favour of quasi-religious faith. In particular, it can be seen in the ramping up of public debt to cover energy bills as well as massive tax cuts. As with the cakeism of Brexit, the issue isn’t the merits of the policy as such, which can be debated, it’s the refusal to accept that the policy comes with any costs or trade-offs. It’s as if the campaigning tactic of dismissing all Brexit costs as ‘Project Fear’ has morphed into a governmental principle that rejects any notion of risk in relation to any policy at all.

That seems especially bizarre in this case, when it has been an article of faith for the Thatcherite Right that ‘the books must balance’. There’s much to debate about that proposition, too, a debate in which context matters more than framing some general rule. In the current context, not least because of the damage Brexit has already done to sterling since the referendum, the principal risk is a full-scale run on the pound, which has already been foreshadowed in the currency markets since Truss took office on her tax-cutting platform.

It’s a risk that can only be enhanced if, as reported, the government is not going to issue an economic forecast with today’s major economic statement, as the suspicion can only be that it is dire. It’s pointless anyway as others can do the sums, and the Institute for Fiscal Studies has already warned that the government’s debt plans are “unsustainable”, highlighting the particular impact of the scale of the anticipated tax cuts but also reflecting the government’s refusal to fund the energy support package through tax-raising measures.

Delusional Truss

This matters because even if Truss and Kwasi Kwarteng were completely right to pour scorn on the Treasury view of the world that doesn’t affect the fact that currency traders hold very similar views, and act on them, and they can’t be sacked or ignored. As with Brexit dogma in general, reality can be denied and domestic critics derided as fearmongers or bullied into silence, but when that dogma meets the external world, whether that be currency markets or the negotiating power of the EU, reality always wins. Truss may put her faith in Patrick Minford but, to put it brutally and crudely, currency traders don’t give the tiniest f*** about Minford or his theories, and nor do investors or any other economic actor. Thus market analysts now say there’s a 25% chance that we will soon see pound-dollar parity for the first time ever.

The other big danger, given ballooning government debt, is that, as Mark Carney long ago warned, ‘the kindness of strangers’ could come to a ‘sudden stop’, with international investors refusing to go on funding government borrowing. For, of course, it is not ‘kindness’ that is at issue, but a cold-eyed assessment of risks and rewards. Although few commented on it at the time, I pointed out last April (see footnote) that the current account deficit was becoming alarmingly high and the forecasts were for it to get worse. Now (although we can’t be certain of the figures yet) it is set to increase dramatically.

In fact, as Paul Mason wrote this week “all six dials on the dashboard of the UK economy: inflation, investment, trade, debt, sterling and the current account [are] flashing red”. For Truss, to proclaim, as she did this week, that her “belief is that Britain’s economic fundamentals are strong” shows precisely the Brexiter logic that ‘belief’ equates with truth. Indeed it’s only last month that she was boosterishly proclaiming that recession was “not inevitable”, despite Bank of England forecasts: yesterday, as interest rates rose to their highest level since 2008, it became clear that even as she spoke the economy was already in recession.

Northern Irish uncertainties

That reality trumps faith and fantasy still isn’t a lesson that Brexiters seem able to learn, though. Nowhere is that more so than in relation to the still ongoing Northern Ireland Protocol debacle. There’s certainly a possibility of a ‘re-set’ now, especially since, as the government accepts there will be no trade deal with the US, the only barely credible reason for refusing SPS alignment has disappeared. And any half-way sensible government faced with as many serious problems as this one would certainly not add to them with the entirely self-imposed one of reneging on the Protocol. Doing so is not just profoundly damaging the UK’s international reputation and relations, it is also exacerbating the energy crisis and prolonging the damaging exclusion of the UK from EU science programmes as well as, down the line, risking the provocation of a trade war with the EU and a rupture with the US.

But, then, a half-way sensible government wouldn’t have appointed ERG Brexiter hardliners Chris Heaton-Harris and Steve ‘hard man’ Baker as, respectively, Northern Ireland Secretary and Minister, suggesting an intention for confrontation. Against that, Heaton-Harris has been making conciliatory statements, explicitly suggesting he shouldn’t be judged on his Brexiter background. In the last few hours there have also been signs of a more moderate approach from the government, with the threat of unilateral derogation from the Protocol apparently lifted (£).

If so, that’s welcome, though note that it would be a second early U-turn from Truss on her campaign promises. It’s too early to judge yet, and we’ve been round these loops of conciliation and aggression so many times, but if a deal is really going to be done, expect at least rumblings of rebellion from the Ultras that, yet again, Brexit is being ‘betrayed’. Baker, in particular, who gave such strong support to Truss’s leadership bid, might be expected to make that charge. The same old Brexit rows that May and Johnson faced lie in wait for Truss. Much more on all this in the coming days and weeks, no doubt.

A very peculiar government

As I write this analysis, and having read those of many others, it strikes me that it is almost impossible to articulate a coherent account of Truss’s government because it is one of the strangest this country has had. It does not even seem to know how to describe itself. It blends aspects of the economics that led to the 1976 IMF bailout – for context, just as the Queen was about to celebrate the mere Silver Jubilee of her reign – with aspects of the Thatcherism that subsequently grew out of that crisis. But whereas Thatcherites had the bedrock of realism to know that ‘you can’t buck the market’, Truss’s Brexiter Thatcherites think that true faith can trump reality.

The government’s avowed priority is economic growth, but it has an undiscussable commitment to hard Brexit, which is the most permanent drag on growth and which itself results in large part from a rejection of the Thatcher-inspired single market. It appears totally uninterested in the horrific impact of Brexit on small exporting firms, which were not only iconic for Thatcher but which are central to economic growth. In fact, it isn’t notably pro-business of any size – even banks aren’t hugely exercised about being able to pay bonuses – so much as in thrall to a ‘Janet and John’ ideological theory about capitalism that has no relationship to actual business needs and priorities.

It is a post-Brexit government, with Brexiters in every key position, but very little of its policy agenda seems to require Brexit. That is certainly true of the core tax-cutting policies, and also of the reported plans for new ‘investment zones’ with lower taxes and planning deregulation (these are different to Freeports but, as with Freeports, are nothing remotely like ‘Charter Cities’, despite the latest swell of social media excitement). Perhaps tellingly, the post of Minister for Brexit Opportunities has been axed, a tacit acknowledgment not just of its failure but of it being doomed to fail. As for the ‘Brexit Freedoms Bill’, wait for the arbitrary dates for dropping EU regulation to approach, and we’ll see what happens (much more on this in future posts, no doubt).

It is a government that presents itself as having a seriousness of purpose that Johnson’s administration lacked, but inherits his cakeism and vapid boosterism; as having a can-do pragmatism whilst asserting a populist preference for faith over evidence and expertise. It is small statist, whilst expanding the size of the state and rejecting ‘austerity’. It is deregulatory but largely coy about saying much about what this means in practice or how it would relate to the central task of creating growth. It acts as if it were a new government and has presented an entirely different manifesto to that of 2019, but it is reliant upon the voting coalition it inherited from 2019 (with consequences already emerging over Truss’s change in fracking policy) and has at best only a couple of years left of this parliament. It is an ideological government but without coherent ideology, and led by a dogmatist who is also an opportunist.

Perhaps more than anything else, and more dangerous than anything else, it seems like a government composed of chancers and mediocrities who are at the same time arrogantly certain of their beliefs. By an accident of circumstances, which in large part include Brexit, they have got hold of a power they have neither the competence nor the wisdom to be entrusted with, but to which they feel an unquestioning entitlement. This is the consequence of the winnowing out of any moderate or even half-way pragmatic figures in the Tory Party since the Brexit vote, to the extent that even someone like Rishi Sunak is now denounced as a ‘socialist’ and Brexit ‘compromiser’. About the best that can be said for this government is that, despite earlier rumours, it seems it will not contain Iain Duncan Smith, John Redwood or David Frost, so, apparently, there are at least some depths to which it will not yet sink.

The bleak reality

Even if the strange coincidence of this new Prime Minister arriving just as the old Queen departs doesn’t lead to extensive national self-reflection, it does present an opportunity for the government. As Lord Ricketts, the former senior civil servant, wrote this week, the swell of international attention and goodwill occasioned by the Queen’s funeral could be a chance to repair the damage that Brexit has brought to international relations. So, too, might the moment of considerable national unity it brought be built on to assuage the divisions of Brexit.

But what seems far more likely is that the world sees a country that for all its faultless pageantry, its sombre and dignified mourning, has lost its way and is now further destabilised by losing one of its deepest anchors. Adrift since Brexit, it is now governed by a motley crew of over-promoted ideologues like Suella Braverman and self-important pinheads like Jacob Rees-Mogg, led by a peculiar and delusional Prime Minister. As the grandeur and collective emotions of the last couple of weeks fade, we are left with the bleak reality of a government not just incapable of resolving the dislocations – economic, geo-political, cultural, reputational – of Brexit and all that has followed, but incapable of recognizing their existence or even of caring were they to do so.

26 Sep 03:51

Don’t use in-app browsers

by Volker Weber

To fight Apple’s privacy protection, Facebook, Instagram et al. have found a way to track everything you do: they draw you into their in-app browser and then inject surveillance JavaScript. If you want to see what they actually do, just forward a link to InAppBrowser.com within the app and click it.

Introducing InAppBrowser.com, a simple tool to list the JavaScript commands executed by the iOS app rendering the page.

With Android, don’t even bother.

26 Sep 03:48

Rewriting the social

by Lilia

One day I should write a long story about disconnecting from the land and finding the connection again. Those with whom we get into the deep and difficult matters know that several years ago I had a difficult experience in our local homeschooling group that involved a lot of pressure to invest in group cohesion and ‘gezelligheid’, to stop talking to my kids in Russian while in the group, and to adhere to Dutch cultural norms about how much individual values and choices could be exercised publicly and how much self-censorship one needed to apply to fit in*.

That resulted in an almost physiological repulsion toward everything Dutch.

It has been a long and winding path to overcome that and to restore the connection with the country where I live, love, raise kids and grow a garden. Step by step I rebuilt and regrew the connections to the land, language, homeschooling community, people and culture… Last weekend an unexpected change of plans brought me into a group where I could rewrite the experience of being part of a group in a way that honors the delicate balance between group cohesion and holding space for individual identities, choices, special needs, and random quirks.

So, before everyday life fully takes its toll with happenings and to-do lists I want to mark this as a milestone. With gratitude to Land van Lisa and the people who shared the space, work, food, words, and all the unspoken with me.

Finding this experience in a former school building that is being redefined and rebuilt as a learning and community space is also symbolic. So is the fact that the connection to something happening half an hour’s distance from home was made via a homeschooling friend from the other side of the country.


* Despite the stereotypical freedom of expression and creating pragmatic accommodations for all sorts of deviations, the Netherlands is very much guided by the culture of ‘doe gewoon’, adhering to the rules and social control.

The post Rewriting the social appeared first on Mathemagenic.

26 Sep 03:47

Denialists, Alarmists, and Doomists

This month I finished Michael E. Mann’s The New Climate War, a treatise on where we’re at with climate change. After my post on climate change, a handful of people mentioned it, and the book does a fantastic job of opening your eyes to the deflection tactics and information warfare that special interest groups (read: lobbyists) use to get you to talk about something other than their clients.

I could go on forever about ways this gross incentive system is destroying the world, but what I found most interesting about Mann’s book were the recurring cast of characters in the Climate Change disaster:

  • Activists - engaged and informed people, possibly with a plan of action
  • Inactivists - unengaged people, possibly adversarial
  • Technologists - people who (naively?) think we’ll solve the issue with technology

Mann then described in detail three more characters and how they—despite their best efforts—impede progress:

  • Denialists - people who downplay the issue and say “it’s not a big deal”.
  • Alarmists - people constantly sounding an alarm, susceptible to pile-on issues.
  • Doomists - people who go to an extreme — beyond the scientific consensus — decrying “It’s too late, we’re doomed.”

Denialists tend to be corporate shills, in for personal benefit. Alarmists tend to be somewhat correct, but ineffective in their constant foreboding. And Doomists, so concerned they’ve lost hope. All common characters, but it’s the Doomist, Mann cautions, that’s working against themselves:

By exceeding the consensus expert science, whilst claiming to be based on it, doomism feeds denialists by discrediting real science and it sets followers up for disillusion.

I hadn’t fully considered that doomism can have a net-negative effect on the cause they care so deeply about. Their doom disillusions followers, their doom discredits reality. As I thought further, my brain identified some recent interactions with Doomists in my own life; at that moment the scales fell off my eyes.

Seeing this cast of characters arrayed on the same stage struck me in a way I haven’t felt in a long time. I started time-traveling through my life seeing these characters pop up again and again; COVID-19, gun control, startups, websites, web apps, accessibility, politics, evangelicalism, non-evangelicalism, neighbors, school boards, clubs, the late 90s Houston punk rock scene, and so on. These are characters in any collection of people bound by common interest trying to enact social change.

I’ve seen the serpent-tongued Denialist, the Alarmist clanging like a gong while no one listens, the inconsolable Doomist, the Dunning-Kruger empowered Technologist, the obstinate Inactivist, the steady but exhausted Activist…

…I’ve seen these characters. I’ve been these characters! I am these characters?!

26 Sep 03:45

Unselfing

Helen Blunden, activatelearning, Sept 22, 2022

Back in the early days of blogging and syndication, there were two major approaches: linkblogging, where you would post a link and summary of someone else's work, or journaling, where you would post your own thoughts and reflections. Most of the blogs and the blogging standards opted for the latter, and that's what eventually evolved into social media. My own work (and that of a few other edubloggers) tended toward the linkblog, which is what I do to this day. Sure, it's useful to write about your own thoughts and feelings, but I've always felt it was much more useful to write about someone else's work. And that's what this post is about, twenty years later. "I do wonder," writes Helen Blunden, "if we were a society that wasn't so narcissistic and self-obsessed, where we promoted and shared the great work of others, if somehow we could become 'more real' again and in touch with ourselves." I think it would make online discourse much more interesting and engaging, to be sure. And - obviously, I think - we would all learn a lot more.

Web: [Direct Link] [This Post]
26 Sep 03:44

Future

Matthias Melcher, x28's New Blog, Sept 23, 2022

Matthias Melcher offers some comments on my recent presentation, most of which I am in agreement with (and especially the notions of 'scientific apprenticeships' and 'simple things like annotation and rearranging'). He raises, though, a common concern about the notion of consensus, saying "majority is not a poor surrogate of what the consensus of the wisest big heads would achieve". I offer a response in the comments, noting that the sort of consensus I am talking about refers to agreement on a source-of-truth mechanism (voting, documentation, provenance, whatever) as opposed to making political and economic decisions by agreement.

Web: [Direct Link] [This Post]
26 Sep 03:44

Becoming an AI-fueled organization

Jason Girzadas, Deloitte, Sept 23, 2022

I said in my presentation the other day that AI is already here. This report (28 page PDF) from Deloitte confirms it. "In fact, more than a quarter of our survey respondents have reached full-scale deployment of five or more types of AI applications within their organization," report the authors. As always, deployment of the technology varies (and though the survey doesn't report, I would expect educational institutions to be laggards). And, as the report says, deployment is never finished, as the technology, and the use cases, continue to evolve. Finally, as always, "Technology cannot deliver transformative results unless organizations reimagine how work gets done." In the case of AI, things like data fluency and trust in agility are required to make it work. Via Irving Wladawsky-Berger.

Web: [Direct Link] [This Post]
26 Sep 03:44

Viewpoint Diversity

Alex Usher, Higher Education Strategy Associates, Sept 23, 2022

Alex Usher offers a good take-down of a recent MacDonald-Laurier Institute paper (54 page PDF) on "viewpoint diversity" at Canadian Universities, calling it "Really bad. Icelandic rotting shark bad. Crystal Pepsi bad. Final Season of Game of Thrones bad." He then offers a point by point demolition. The sample is biased. The scale is polarized. The survey measures the wrong thing. It has an absurd definition of 'academic freedom'. Etc. My own take on all of this is that 'viewpoint diversity' in Canadian academia (or anywhere else) can be an issue when and only when the demand for 'viewpoint diversity' is applied to corporate boardrooms and financial institutions. That's not going to happen, of course. So why don't we just let people believe what they believe and do the jobs they do?

Web: [Direct Link] [This Post]
26 Sep 03:43

Raspberry Pi Zero 2 Review (It Sucks)

by jamesachambers
This is what not giving a shit about your product looks like

It's no secret that there's a massive Raspberry Pi shortage right now for consumers, due to the Raspberry Pi Foundation's strategy of allocating boards to business and IoT projects first, with official resellers getting whatever is left. Since I own one of every Raspberry Pi that has ever been made, I ended up buying one of these scalped from eBay for about $57 (all the way from Europe), which is much better than the $80-$90+ street price in the United States. In other words, I paid the scalpers to get my hands on this board so you don't have to, and let me tell you folks, it is honestly garbage and literally the dumbest purchase I've ever made. Today I will show and explain why the Raspberry Pi Zero 2 is the lamest and most disappointing product release that the Raspberry Pi Foundation has ever done. Let's get started!

Source

26 Sep 03:41

Trilogy-Lovin’ Me/To Make A Woman Feel Wanted/Peace Of Mind

by bob
Spotify playlist: https://spoti.fi/3Sa9aJc 1 “I want it all I want it all I want it all AND I WANT IT NOW!” I’ve been in a funk. It’s been this way since I came back from Colorado after Labor Day. It’s not like the time in Vail was a hedonistic romp, rather it was a working […]
26 Sep 03:41

2022-09-23 BC tiny

by Ducky

Here are the wastewater charts from Jeff’s spreadsheet:

Note: last week’s chart was the same as the one from two weeks ago because I, uh, didn’t realize I needed to refresh the Google Sheets page. (In my defense, Google Docs updates without refreshing.)

26 Sep 03:40

Get in Zoomer, We're Saving React

Looking back, and forward

Zoomer

Lately, it seems popular to talk smack about React. Both the orange and the red site recently spilled the tea about how mean Uncle React has been, and how much nicer some of these next-gen frameworks supposedly are.

I find this bizarre for two reasons:

  • Most next-gen React spin-offs strike me as universally regressive, not progressive.
  • The few exceptions don't seem to have any actual complex, battle-hardened apps to point to, to prove their worth.

Now, before you close this tab thinking "ugh, not another tech rant", let me first remind you that a post is not a rant simply because it makes you angry. Next, let me point out that I've been writing code for 32 years. You should listen to your elders, for they know shit and have seen shit. I've also spent a fair amount of time teaching people how to get really good at React, so I know the pitfalls.

You may also notice that not even venerated 3rd party developers are particularly excited about React 18 and its concurrent mode, let alone the unwashed masses. This should tell you the React team itself is suffering a bit of an existential crisis. The framework that started as just the V in MVC can't seem to figure out where it wants to go.

So this is not the praise of a React fanboy. I built my own clone of the core run-time, and it was exactly because its limitations were grating, despite the potential there. I added numerous extensions, and then used it to tackle one of the most challenging domains around: GPU rendering. If one person can pull that off, that means there's actually something real going on here. It ties into genuine productivity boons, and results in robust, quality software, which seems to come together as if by magic.

To put it differently: when Figma recently announced they were acquired for $20B by Adobe, we all intuitively understood just how much of an exceptional black swan event that was. We know that 99.99…% of software companies are simply incapable of pulling off something similar. But do we know why?

Zoomer

IBM logo

Where we came from

If you're fresh off the boat today, React can seem like a fixture. The now-ancient saying "Nobody ever got fired for choosing IBM" may as well be updated for React. Nevertheless, when it appeared on the scene, it was wild: you're going to put the HTML and CSS in the JavaScript? Are you mad?

Yes, it was mad, and like Galileo, the people behind React were completely right, for they integrated some of the best ideas out there. They were so right that Angular pretty much threw in the towel on its abysmal two-way binding system and redesigned it to adopt a similar one-way data flow. They were so right that React also dethroned the previous fixture in web land, jQuery, as the diff-based Virtual DOM obsoleted almost all of the trickery people were using to beat the old DOM into shape. The fact that you could use e.g. componentDidUpdate to integrate legacy code was just a conceit, a transition mechanism that spelled out its own obsolescence as soon as you got comfortable with it.

Angular Template

Many competing frameworks acted like this wasn't so, and stuck to the old practice of using templates. They missed the obvious lesson here: every templating language inevitably turns into a very poor programming language over time. It will grow to add conditionals, loops, scopes, macros, and other things that are much nicer in actual code. A templating language is mainly an inner platform effect. It targets a weird imagined archetype of someone who isn't allergic to code, but somehow isn't smart enough to work in a genuine programming language. In my experience, this archetype doesn't actually exist. Designers don't want to code at all, while coders want native expressiveness. It's just that simple.

Others looked at the Virtual DOM and only saw inefficiency. They wanted to add a compiler, so they could reduce the DOM manipulations to an absolute minimum, smugly pointing to benchmarks. This was often just premature optimization, because it failed to recognize the power of dynamic languages: that they can easily reconfigure their behavior at run-time, in response to data, in a turing-complete way. This is essential for composing grown up apps that enable user freedom. The use case that most of the React spin-offs seem to be targeting is not apps but web sites. They are paving cow paths that are well-worn with some minor conveniences, while never transcending them.

var RouterMixin = {
  contextTypes: {
    router: React.PropTypes.object.isRequired
  },

  // The mixin provides a method so that
  // components don't have to use the
  // context API directly.
  push: function(path) {
    this.context.router.push(path)
  }
};

var Link = React.createClass({
  mixins: [RouterMixin],

  handleClick: function(e) {
    e.stopPropagation();

    // This method is defined in RouterMixin.
    this.push(this.props.to);
  },

  render: function() {
    return (
      <a onClick={this.handleClick}>
        {this.props.children}
      </a>
    );
  }
});

module.exports = Link;

React circa 2016

It's also easy to forget that React itself had many architectural revisions. When old farts like me got in on it, components still had mix-ins, because genuine classes were a distant dream in JS. When ES classes showed up, React adopted those, but it didn't fundamentally change the way you structured your code. It wasn't until React 16.8 (!) that we got hooks, which completely changed the way you approached it. This reduced the necessary boilerplate by an order of magnitude, and triggered a cambrian explosion of custom hook development. That is, at least until the buzz wore off, and only the good ideas remained standing.

Along the way, third party React libraries have followed a similar path. Solutions like Redux appeared, got popular, and then were ditched as people realized the boilerplate just wasn't worth it. It was a necessary lesson to learn.

This legacy of evolution is also where the bulk of React's perceived bloat sits today. As browsers evolved, as libraries got smarter, and as more people ditched OO, much of it is now indeed unnecessary for many use cases. But while you can tweak React with a leaner-and-meaner reimplementation, this doesn't fundamentally alter the value proposition, or invalidate the existing appeal of it.

The fact remains that before React showed up, nobody really had any idea how to make concepts like URL routers, or drag and drop, or UI design systems, truly sing, not on the web. We had a lot of individual pieces, but nothing solid to puzzle them together with. Nevertheless, there is actual undiscovered country beyond, and that's really what this post is about: looking back and looking forward.

If there's one solid criticism I've heard of React, it's this: that no two React codebases ever look alike. This is generally true, but it's somewhat similar to another old adage: that happy families all look alike, but every broken family is broken in its own particular way. The reason bad React codebases are bad is because the people who code it have no idea what they're supposed to be doing. Without a model of how to reason about their code in a structured way, they just keep adding on hack upon hack, until it's better to throw the entire thing away and start from scratch. This is no different from any other codebase made up as they go along, React or not.

Where React came from is easy to explain, but difficult to grok: it's the solution that Facebook arrived at, in order to make their army of junior developers build a reliable front-end, that could be used by millions. There is an enormous amount of hard-earned experience encoded in its architecture today. Often though, it can be hard to sort the wheat from the chaff. If you stubbornly stick to what feels familiar and easy, you may never understand this. And if you never build anything other than a SaaS-with-forms, you never will.

I won't rehash the specifics of e.g. useEffect here, but rather, drop in a trickier question: what if the problem people have with useEffect + DOM Events isn't the fault of hooks at all, but is actually the fault of the DOM?

I only mention it because when I grafted an immediate-mode style interaction model onto my React clone instead, I discovered that complex gesture controllers suddenly became 2-3x shorter. What's more, declaring data dependencies that "violate the rules of React" wasn't an anti-pattern at all: it was actually key to the entire thing. So when I hear that people are proudly trying to replace dependencies with magic signals, I just shake my head and look elsewhere.

Which makes me wonder… why is nobody else doing things like this? Immediate mode UI isn't new, not by a long shot. And it's hardly the only sticking point.

Mac OS X Leopard (2007)

Where we actually came from

Here's another thing you may not understand: just how good old desktop software truly was.

The gold standard here is Mac OS X, circa 2008. It was right before the iPhone, when Apple was still uniquely focused on making its desktop the slickest, most accessible platform around. It was a time when sites like Ars Technica still published real content, and John Siracusa would lovingly post multi-page breakdowns of every new release, obsessing over every detail for years on end. Just imagine: tech journalists actually knowing the ins-and-outs of how the sausage was made, as opposed to copy/pasting advertorials. It was awesome.

This was supported by a blooming 3rd party app ecosystem, before anyone had heard of an App Store. It resulted in some genuine marvels, which fit seamlessly into the design principles of the platform. For example, Adium, a universal instant messenger, which made other open-source offerings seem clunky and downright cringe. Or Growl, a universal notification system that paired seamlessly with it. It's difficult to imagine this not being standard in every OS now, but Mac enthusiasts had it years before anyone else.

Adium IM Client

The monopolistic Apple of today can't hold a candle to the extended Apple cinematic universe from before. I still often refer to the Apple Human Interface Guidelines from that era, rather than the more "updated" versions of today, which have slowly but surely thrown their own wisdom in the trash.

The first section of three, Application Design Fundamentals, has almost nothing to do with Macs specifically. You can just tell from the chapter titles:

  • The Design Process
  • Characteristics of Great Software
  • Human Interface Design
  • Prioritizing Design Decisions
Apple Human Interface Guidelines 2008 - Outline

Like another favorite, The Design of Everyday Things, it approaches software first and foremost as tools designed for people to use. The specific choices made in app design can be the difference between something that's a joy to use and something that's resented and constantly fought against.

So what exactly did we lose? It's quite simple: by moving software into the cloud and turning them into web-based SaaS offerings, many of the basic affordances that used to be standard have gotten watered down or removed entirely. Here are some examples:

  • Menus let you cross over empty space and other menu items, instead of strictly enforcing hover rectangles.
  • You can drag and drop the file icon from a document’s titlebar e.g. to upload it, instead of having to go look for it again.
  • Holding keyboard modifiers like CTRL or ALT is reflected instantly in menus, and used to make power-user features discoverable-yet-unobtrusive.

And here are some more:

  • You can browse years of documents, emails, … and instantly draft new ones. Fully off-line, with zero lag.
  • You can sort any table by any column, and it will remember prior keys to produce a stable sort for identical values.
  • Undo/redo is standard and expected, even when moving entire directories around in the Finder.
  • Copy/pasting rich content is normal, and entirely equivalent to dragging and dropping it.
  • When you rename or move a file that you're editing, its window instantly reflects the new name and location.
  • You can also drag a file into an "open file" dialog, to select it there.
  • When downloading a file, the partial download has a progress bar on the icon. It can be double clicked to resume, or even copied to another machine.

It's always amusing to me to watch a power user switch to a Mac late in life, because much of their early complaints stem from not realizing there are far more obvious ways to do what they've trained themselves to do in a cumbersome way.

On almost every platform, PDFs are just awful to use. Whereas out-of-the-box on a Mac, you can annotate them to your heart's content, or drag pages from one PDF to another to recompose it. You can also sign them with a signature read from your webcam, for those of us who still know what pens are for. This is what happens when you tell companies like Adobe to utterly stuff it and just show them how it's supposed to be done, instead of waiting for their approval. The productivity benefits were enormous.

Fountain pen

As an aside, if all of this seems quaint or positively boomeresque, here's a tip: forcing yourself to slow down and work with information directly, with your hands, manipulating objects physical or virtual, instead of offloading it all to a cloud… this is not an anti-pattern. Neither is genuine note taking on a piece of paper. You should try it sometime.

At the time, many supposed software experts scoffed at Apple, deriding their products as mere expensive toys differentiated purely by "marketing". But this is the same company that seamlessly transitioned its entire stack from PowerPC, to x86, to x64, and eventually ARM, with most users remaining blissfully unaware this ever took place.

This is what the pinnacle of our craft can actually look like.

Apple didn't just knock it out of the park when it came to the OS or the overall UI: they also shipped powerful first-party apps like iMovie and Keynote, which made competing offerings look positively shabby. Steve Jobs used them for his own keynotes, arguably the best in the business.

Similarly, what set the iPhone apart was not just its touch interface, but that they actually ported a mature media and document stack to mobile wholesale. At that time, the "mobile web" was a complete and utter joke, and it would take Android years to catch up, whether it was video or music, or basic stuff like calendar invites and contacts.

It has nothing to do with marketing. Indeed, while many companies have since emulated and perfected their own Apple-style pitch, almost no-one manages to get away from that tell-tale "enterprise" feel. They don't know or care how their users actually want to use their products: the people in charge don't have the first clue about the fundamentals of product design. They just like shiny things when they see them.

iMovie (2010)

The Reactive Enterprise

What does any of this have to do with React? Well it's very simple. Mac OS X was the first OS that could actually seriously claim to be reactive.

The standard which virtually everyone emulated back then was Windows. And in Windows, the norm—which mostly remains to this day—is that when you query information, that information is fetched once, and never updated. The user was just supposed to know that in order to see it update, they had to manually refresh it, either by bumping a selection back and forth, or by closing and reopening a dialog.

Windows 95

The same applied to preferences: in Windows land, the established pattern was to present a user with a set of buttons, the triad of Ok, Cancel and Apply. This is awful, and here's why. If you click Ok, you are committing to a choice you haven't yet had the chance to see the implications of. If you click Cancel, you are completely discarding everything you did, without ever trying it out. If you click Apply, it's the same as pressing Ok, just the window stays open. None of the 3 buttons let you interact confidently, or easily try changes one by one, reinforcing the idea that it's the user's fault for being "bad at computers" if it doesn't do what they expect, or they don't know how to back out.

The bold Mac solution was that toggling a preference should take effect immediately. Even if that choice affects the entire desktop, such as changing the UI theme. So if that's not what you wanted, you simply clicked again to undo it right away. Macs were reactive, while Windows was transactional. The main reason it worked this way was because most programmers had no clue how to effectively make their software respond to arbitrary changes, and Microsoft couldn't go a few years without coming up with yet another ill-conceived UI framework.

This divide has mostly remained, with the only notable change being that on mobile devices, both iOS and Android tend to embrace the reactive model. However, given that much of the software used is made partially or wholly out of web views, this is a promise that is often violated and rarely seen as an inviolable constraint. It's just a nice-to-have. Furthermore, while it has become easier to display reactive information, the crucial second half of the equation—interaction—remains mostly neglected, also by design.

I'm going to be cheeky and say if there's anyone who should take the blame for this, it's back-end engineers and the technology choices they continue to make. The very notion of "back-end" is a fallacy: it implies that one can produce a useful, working system, without ever having to talk to end-users.

Just imagine how alien this concept would be to an engineer before the software revolution happened: it'd be like suggesting you build a bridge without ever having to think about where it sits or who drives over it, because that's just "installation" and "surfacing". In civil engineering, catastrophes are rare, and each is a cautionary tale, never to be repeated: the loss of life was often visceral and brutal. But in software, we embraced never learning such lessons.

A specific evil here is the legacy of SQL and the associated practices, which fragments and normalizes data into rigid tables. As a result, the effect of any change is difficult to predict, and virtually impossible to reliably undo or synchronize after the fact.

This is also the fault of "enterprise", in a very direct sense: SQL databases and transactions are mainly designed to model business processes. They evolved to model bureaucratic workflows in actual enterprises, with a clear hierarchy of command, a need to maintain an official set of records, with the ability for auditing and oversight.

However, such classic enterprises were of course still run by people, by individuals. The bulk of the work they did was done offline, producing documents, spreadsheets and other materials through direct interaction and iteration. The bureaucracy was a means to an end, it wasn't the sole activity. The idea of an organization or country run entirely on bureaucracy was the stuff people made satirical movies about.

And yet, many jobs now follow exactly this template. The activity is entirely coordinated and routed through specific SaaS apps, either off-the-shelf or bespoke, which strictly limit the available actions. They only contain watered down mockeries of classic desktop concepts such as files and folders, direct manipulation of data, and parallel off-line workstreams. They have little to no affordances for drafts, iteration or errors. They are mainly designed to appeal to management, not the riff-raff.

The promise of adopting such software is that everything will run more smoothly, and that oversight becomes effortless thanks to a multitude of metrics and paper trails. The reality is that you often replace tasks that ordinary, capable employees could do themselves, with a cumbersome and restrictive process. Information becomes harder to find, mistakes are more difficult to correct, and the normal activity of doing your job is replaced with endless form filling, box-ticking and notification chasing. There is a reason nobody likes JIRA, and this is it.

What's more, by adopting SaaS, companies put themselves at the mercy of someone else's development process. When dealing with an unanticipated scenario, you often simply can't work around it with the tools given, by design. It doesn't matter how smart or self-reliant the employees are: the software forces them to be stupid, and the only solution is to pay the vendor and wait 3 months or more.

For some reason, everyone has agreed that this is the way forward. It's insane.

Oracle Cloud with AI Bullshit

Circling Back

Despite all its embedded architectural wisdom, this is a flaw that React shares: it was never meant to enable user freedom. Indeed, the very concept of Facebook precludes it, arguably the world's biggest lock-in SaaS. The interactions that are allowed there are exactly like any other SaaS: GET and POST to a monolithic back-end, which enforces rigid processes.

As an app developer, if you want to add robust undo/redo, comfy mouse interactions and drag-n-drop, keyboard shortcuts, and all the other goodies that were standard on the desktop, there are no easy architectural shortcuts available today. And if you want to add real-time collaboration, practically a necessity for real apps, all of these concerns spill out, because they cannot be split up neatly into a wholly separate front-end and back-end.

A good example is when people mistakenly equate undo/redo with a discrete, immutable event log. This is fundamentally wrong, because what constitutes an action from the user's point of view is entirely different from how a back-end engineer perceives it. For example, undo/redo needs to group multiple operations to enable sensible, logical checkpoints… but it also needs to do so on the fly, for actions which are rapid and don't conflict.

If you don't believe me, go type some text in your text editor and see what happens when you press CTRL-Z. It won't erase character by character, but did you ever think about that? Plus, if multiple users collaborate, each needs their own undo/redo stack, which means you need the equivalent of git rebasing and merging. You'd be amazed how many people don't realize this.

If we want to move forward, surely, we should be able to replicate what was normal 20 years ago?

SupaBase TinyBase RxDB
Real-time databases

There are a few promising things happening in the field, but they are so, so rare… like the slow-death-and-rebirth of Firebase into open-source alternatives and lookalikes. But even then, robust real-time collaboration remains a 5-star premium feature.

Similarly, big canvas-based apps like Figma, and scrappy upstarts like TLDraw have to painstakingly reinvent all the wheels, as practically all the relevant knowledge has been lost. And heaven forbid you actually want a decent, GPU-accelerated renderer: you will need to pay a dedicated team of experts to write code nobody else in-house can maintain, because the tooling is awful and also they are scared of math.

What bugs me the most is that the React dev team and friends seem extremely unaware of any of this. The things they are prioritizing simply don't matter in bringing the quality of the resulting software forward, except at the margins. It'll just load the same HTML a bit faster. If you stubbornly refuse to learn what memo(…) is for, it'll render slightly less worse. But the advice they give for event handling, for data fetching, and so on… for advanced use it's simply wrong.

A good example is that GraphQL query subscriptions in Apollo split up the initial GET from the subsequent SUBSCRIBE. This means there is always a chance one or more events were dropped in between the two. Nevertheless, this is how the library is designed, and this is what countless developers are doing today. Well okay then.
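
One common way to close that gap is to open the subscription first, buffer whatever arrives while the initial query is in flight, and then replay the buffer. The sketch below uses made-up client calls and an applyEvent reducer as stand-ins; it is not Apollo's actual API, and it assumes events are idempotent or carry version numbers.

// Sketch only: `client.subscribe`, `client.query` and `applyEvent`
// are assumed stand-ins, not a real client library's API.
async function syncedQuery(client, query, subscription, applyEvent) {
  let snapshot = null;
  const buffer = [];

  const sub = client.subscribe(subscription, (event) => {
    if (snapshot === null) {
      buffer.push(event); // initial query hasn't resolved yet
    } else {
      snapshot = applyEvent(snapshot, event);
    }
  });

  const initial = await client.query(query);
  // Replay anything that arrived during the initial fetch. Some of it
  // may already be reflected in `initial`, hence the idempotency assumption.
  snapshot = buffer.reduce(applyEvent, initial);
  buffer.length = 0;

  return {
    get data() { return snapshot; },
    stop: () => sub.unsubscribe(),
  };
}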

Another good example is implementing mouse gestures, because mouse events happen quicker than React can re-render. Making this work right the "proper way" is an exercise in frustration, and eventually you will conclude that everything you've been told about non-HTML-element useRef is a lie: just embrace mutating state here.
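
For the record, the "forbidden" pattern looks roughly like this: gesture state lives in a mutable ref, pointermove mutates it and writes to the DOM directly, and React only hears about the result when the gesture ends. This is a sketch of one common shape of the workaround, with names chosen for illustration.

import { useRef, useCallback } from 'react';

// Gesture state in a mutable ref: no re-render per pointermove.
// React state is only updated (via onCommit) when the drag ends.
function useDrag(onCommit) {
  const state = useRef({ x: 0, y: 0, startX: 0, startY: 0 });
  const nodeRef = useRef(null);

  const onPointerDown = useCallback((e) => {
    const s = state.current;
    s.startX = e.clientX - s.x;
    s.startY = e.clientY - s.y;
    e.currentTarget.setPointerCapture(e.pointerId);
  }, []);

  const onPointerMove = useCallback((e) => {
    if (!e.currentTarget.hasPointerCapture(e.pointerId)) return;
    const s = state.current;
    s.x = e.clientX - s.startX;
    s.y = e.clientY - s.startY;
    if (nodeRef.current) {
      // Write straight to the DOM; React is not involved per event.
      nodeRef.current.style.transform = `translate(${s.x}px, ${s.y}px)`;
    }
  }, []);

  const onPointerUp = useCallback((e) => {
    e.currentTarget.releasePointerCapture(e.pointerId);
    onCommit(state.current.x, state.current.y); // commit once, at the end
  }, [onCommit]);

  return { nodeRef, onPointerDown, onPointerMove, onPointerUp };
}

Spread the handlers and nodeRef onto the element, and the component only re-renders when onCommit lands at the end of the gesture.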

In fact, despite being told this will cause bugs, I've never had any issues with it in React 17. This leads me to suspect that what they were really doing was trying to prevent people from writing code that would break in React 18's concurrent mode. If so: dick move, guys. Here's what I propose: if you want to warn people about "subtle bugs", post a concrete proof, or GTFO.

* * *

If you want to build a truly modern, robust, desktop-class web app with React, you will find that you still need to pretty much make apple pie from scratch, by first re-inventing the entire universe. You can try starting with the pre-made stuff, but you will hit a wall, and/or eventually corrupt your users' data. It's simply been my experience, and I've done the React real-time collaboration rodeo with GPU sprinkles on top multiple times now.

Crucially, none of the React alternatives solve this, indeed, they mostly just make it worse by trying to "helpfully" mutate state right away. But here's the annoying truth: you cannot skip learning to reason about well-ordered orchestration. It will just bite you in the ass, guaranteed.

What's really frustrating about all this is how passive and helpless the current generation of web developers seem to be in all this. It's as if they've all been lulled into complacency by convenience. They seem afraid to carve out their own ambitious paths, and lack serious gusto for engineering. If there isn't a "friendly" bot spewing encouraging messages with plenty of 👏 emoji at every turn, they won't engage.

As someone who took a classical engineering education, which included not just a broad scientific and mathematical basis, but crucially also the necessary engineering ethos, this is just alien to me. Call me cynical all you want, but it matches my experience. Coming after the generation that birthed Git and BitTorrent, and which killed IE with Firefox and Konqueror/WebKit, it just seems ridiculous.

Fuck, most zoomers don't even know how to dance. I don't mean that they are bad at dancing, I mean they literally won't try, and just stand around awkwardly.

Just know: nobody else is going to do it for you. So what are you waiting for?

26 Sep 03:37

How Fastmail helped pave a path toward this career changer’s dream job

by The Fastmail Team

Support Developer Rasha Moumneh shares her journey from nonprofit worker to tech pro.


This post was originally published on Technical.ly.

Rasha Moumneh was burnt out. The former human rights professional felt the impact of her work in nonprofits being stifled by internal politics, and her foray into academia — a master’s in anthropology from University College London, followed by acceptance into Rutgers University for her Ph.D. in 2013 — left her uninspired. She had dreams of becoming a developer in the tech industry but believed it was impossible without a degree in computer science.

Enter Fastmail. Moumneh, originally from Lebanon, but by then a resident of Philadelphia, was on the hunt for a tech company with a collaborative culture that would teach her the ropes when she discovered the independent email provider. Though she didn’t yet have the technical experience to become a developer, she was offered an entry-level role on the tech support team with opportunities for growth.

“I had a clear path that I wanted to go down,” Moumneh said. “I wanted to work at a place where I could get a lot of facetime with developers that also held a strong set of ethics. I found that at Fastmail.”

Fastmail is an Australian email provider that values customer privacy and being a “good internet citizen” above all else. For as little as $3 per month, the company provides completely ad-free products and services, protects customer data, and eliminates unwanted solicitations. In 2015, Fastmail acquired Pobox — created by Penn student, and now COO of Fastmail, Helen Horstmann-Allen. The acquisition gave Fastmail permanent residence in Philadelphia, elevating the 40-person company (17 in Philly) into a small, but global business.


A benefit of working at a smaller company, Moumneh said, is that managers have the time and bandwidth to pay close attention to employee growth and development.

“Managers are aware that no one wants to stay in an entry-level role forever,” she said. “There is constant encouragement for us to pursue our interests, especially for those of us on the support team. Early on, I made my career intentions clear and there has been a constant effort towards helping me upskill, both in the type of projects I am given and through the CTO’s wonderful mentorship.”

While Moumneh received extensive on-the-job training as a Fastmail tech support specialist, she used her annual professional development budget to enroll in the Lambda School’s coding boot camp. After she graduated, and only a little over a year after she joined the support team, Moumneh was proudly promoted to her current role of support developer.

Now, Moumneh bridges the gap between the support staff and developers, escalating support problems that become too technical and performing development tasks. The work, and the company’s values, have been a welcome reprieve from the stressful and red-tape-laced world of the human rights field, she said.

“I wanted to work for a company where I could wake up, look at myself in the mirror and feel OK,” she said. “Fastmail promotes the idea of being a ‘good internet citizen.’ It genuinely values people’s privacy. It holds a set of ethics in today’s internet landscape and maintains a human-centered focus.”

It also, as Moumneh learned firsthand, sees value in people, and supports their growth, whether they are sought-after tech experts or trying to build a brand new career.

“Fastmail is a company that invests in its people,” Moumneh said. “There are paths forward for people in all positions.”


At Fastmail, we work on making email better for everyone. Building an incredible team is an important part of our company and our values. Check out our job listings page for opportunities to join our team!

26 Sep 03:35

The Least Assholish Person

@ellesystem wrote:

What actually makes a good first language? I’m sure CS ed has grappled with this, but it seems like a daunting question.

@elliewix replied

The language the least assholish person willing to teach in is the best. Anything that’s the best can still be made the worst by the instructor. (sorry, I’m glad people are looking at this but I’m just so exhausted and burned out from the entire enterprise of everthing)

She’s right, which means my one-day class on how to teach has a new and better goal: to make people the least assholish teachers they can possibly be.

26 Sep 03:35

Clothes and connection

by russell davies
26 Sep 03:34

Post-Fiona Back Yard

by peter@rukavina.net (Peter Rukavina)
19 Sep 16:18

Deploying Python web apps as AWS Lambda functions

by Simon Willison

I've been wanting to figure out how to do this for years. Today I finally put all of the pieces together for it.

AWS Lambda can host functions written in Python. These are "scale to zero" - my favourite definition of serverless! - which means you only pay for the traffic that they serve. A project with no traffic costs nothing to run.

You used to have to jump through a whole bunch of extra hoops to get a working URL that triggered those functions, but in April 2022 they released Lambda Function URLs and dramatically simplified that process.

There are still a lot of steps involved though. Here's how to deploy a Python web application as a Lambda function.

Set up the AWS CLI tool

I did this so long ago I can't remember how. You need an AWS account and you need to have the AWS CLI tool installed and configured.

Running aws --version should return a version number of 1.22.90 or higher, as that's when they added function URL support.

I found I had too old a version of the tool. This is how I eventually figured out how to upgrade it:

head -n 1 $(which aws)

Output:

#!/usr/local/opt/python@3.9/bin/python3.9

This showed me the location of the Python environment that contained the tool. I could then edit that path to upgrade it like so:

/usr/local/opt/python@3.9/bin/pip3 install -U awscli

Create a Python handler function

This is "hello world" as a Python handler function. Put it in lambda_function.py:

def lambda_handler(event, context): 
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "text/html"
        },
        "body": "<h1>Hello World from Python</h1>"
    }

Add that to a zip file

This is the part of the process that I found most unintuitive. Lambda functions are deployed as zip files. The zip file needs to contain both the Python code AND all of its dependencies - more on that to come.

Our first function doesn't have any dependencies, which makes things a lot easier. Here's how to turn it into a zip file ready to be deployed:

zip function.zip lambda_function.py

Create a role with a policy

You only have to do this the first time you deploy a Lambda function. You need an IAM role that you can use for the other steps.

This command creates a role called lambda-ex:

aws iam create-role \
  --role-name lambda-ex \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"}
    ]}'

Then you have to run this. I don't know why this can't be handled as part of the create-role command, but it's necessary:

aws iam attach-role-policy \
  --role-name lambda-ex \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

Find your AWS account ID

You need to know your AWS account ID for the next step.

You can find it by running this command:

aws sts get-caller-identity \
  --query "Account" --output text

I assigned it to an environment variable so I could use it later like this:

export AWS_ACCOUNT_ID=$(
  aws sts get-caller-identity \
  --query "Account" --output text
)

Run this to confirm that worked:

echo $AWS_ACCOUNT_ID

Deploy that function

Now we can deploy the zip file as a new Lambda function!

Pick a unique function name - I picked lambda-python-hello-world.

Then run the following:

aws lambda create-function \
  --function-name lambda-python-hello-world \
  --zip-file fileb://function.zip \
  --runtime python3.9 \
  --handler "lambda_function.lambda_handler" \
  --role "arn:aws:iam::${AWS_ACCOUNT_ID}:role/lambda-ex"

We're telling it to deploy our function.zip file using the python3.9 runtime.

We list lambda_function.lambda_handler as the handler because our Python file was called lambda_function.py and the function was called lambda_handler.

If all goes well you should get back a response from that command that looks something like this:

{
    "FunctionName": "lambda-python-hello-world",
    "FunctionArn": "arn:aws:lambda:us-east-1:462092780466:function:lambda-python-hello-world",
    "Runtime": "python3.9",
    "Role": "arn:aws:iam::462092780466:role/lambda-ex",
    "Handler": "lambda_function.lambda_handler",
    "CodeSize": 332,
    "Description": "",
    "Timeout": 3,
    "MemorySize": 128,
    "LastModified": "2022-09-19T02:27:18.213+0000",
    "CodeSha256": "Y1nCZLN6KvU9vUmhHAgcAkYfvgu6uBhmdGVprq8c97Y=",
    "Version": "$LATEST",
    "TracingConfig": {
        "Mode": "PassThrough"
    },
    "RevisionId": "316481f5-7934-4e54-914f-6b075bb7d9dd",
    "State": "Pending",
    "StateReason": "The function is being created.",
    "StateReasonCode": "Creating",
    "PackageType": "Zip",
    "Architectures": [
        "x86_64"
    ],
    "EphemeralStorage": {
        "Size": 512
    }
}

Grant permission for it to be executed

This magic command is also necessary for everything to work:

aws lambda add-permission \
  --function-name lambda-python-hello-world \
  --action lambda:InvokeFunctionUrl \
  --principal "*" \
  --function-url-auth-type "NONE" \
  --statement-id url

Give it a Function URL

We need a URL that we can access in a browser to trigger our function.

Here's how to add a new Function URL to our deployed function:

aws lambda create-function-url-config \
  --function-name lambda-python-hello-world \
  --auth-type NONE

That --auth-type NONE means anyone on the internet will be able to trigger the function by visiting the URL.

This should return something like the following:

{
    "FunctionUrl": "https://m2jatdfy4bulhvsfcrfc6sfw2i0bjfpx.lambda-url.us-east-1.on.aws/",
    "FunctionArn": "arn:aws:lambda:us-east-1:462092780466:function:lambda-python-hello-world",
    "AuthType": "NONE",
    "CreationTime": "2022-09-19T02:27:48.356967Z"
}

And sure enough, https://m2jatdfy4bulhvsfcrfc6sfw2i0bjfpx.lambda-url.us-east-1.on.aws/ now returns "Hello World from Python".

Updating the function

Having deployed the function, updating it is pleasantly easy.

You create a new function.zip file - which I do like this:

rm -f function.zip # Delete if it exists
zip function.zip lambda_function.py 

And then deploy the update like so:

aws lambda update-function-code \
  --function-name lambda-python-hello-world \
  --zip-file fileb://function.zip

Adding pure Python dependencies

Adding dependencies to the project was by far the most confusing aspect of this whole process.

Eventually I found a good way to do it thanks to the example code published to accompany this YouTube video by Pixegami.

The trick is to include ALL of your dependencies in the root of your zip file.

Forget about requirements.txt and suchlike - you need to install copies of the actual dependencies themselves.

Here's the recipe that works for me. First, create a requirements.txt file listing your dependencies:

cowsay

Now use the pip install -t command to install those requirements into a specific directory - I use lib:

python3 -m pip install -t lib -r requirements.txt

Run ls -lah lib to confirm that the files are in there.

ls lib | cat
bin
cowsay
cowsay-5.0-py3.10.egg-info

Now use this recipe to add everything in lib to the root of your zip file:

(cd lib; zip ../function.zip -r .)

You can run this command to see the list of files in the zip:

unzip -l function.zip

Let's update lambda_function.py to demonstrate the cowsay library:

import cowsay


def lambda_handler(event, context): 
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "text/plain"
        },
        "body": cowsay.get_output_string("pig", "Hello world, I am a pig")
    }

Add that updated lambda_function.py to the zip file again:

zip function.zip lambda_function.py

Deploy the update:

aws lambda update-function-code \
  --function-name lambda-python-hello-world \
  --zip-file fileb://function.zip

Hit refresh on the URL from earlier and you should see:

  _______________________
| Hello world, I am a pig |
  =======================
                       \
                        \
                         \
                          \
                                    ,.
                                   (_|,.
                                   ,' /, )_______   _
                               __j o``-'        `.'-)'
                               (")                 \'
                               `-j                |
                                 `-._(           /
                                    |_\  |--^.  /
                                   /_]'|_| /_)_/
                                       /_]'  /_]'

Advanced Python dependencies

The above recipe works fine for dependencies that are written only in Python.

Where things get more complicated is when you want to use a dependency that includes native code.

I use a Mac. If I run pip install -t lib -r requirements.txt I'll get the Mac versions of those dependencies.

But AWS Lambda functions run on Amazon Linux, so we need to include versions of our packages that are built for that platform in our zip file.

I first had to do this because I realized the python3.9 Lambda runtime includes a truly ancient version of SQLite - version 3.7.17 from 2013-05-20.

The pysqlite3-binary package provides a much more recent SQLite, and Datasette uses that automatically if it's installed.

I figured the best way to do this would be to run the pip install command inside an Amazon Linux Docker container. After much head scratching, I came up with this recipe for doing that:

docker run -t -v $(pwd):/mnt \
  public.ecr.aws/sam/build-python3.9:latest \
  /bin/sh -c "pip install -r /mnt/requirements.txt -t /mnt/lib"

  • The -v $(pwd):/mnt flag mounts the current directory as /mnt inside the container.
  • public.ecr.aws/sam/build-python3.9:latest is the AWS SAM build image for the Python 3.9 Lambda runtime, so pip installs Amazon Linux builds of each package.
  • /bin/sh -c "pip install -r /mnt/requirements.txt -t /mnt/lib" runs pip install inside the container, writing the installed files to /mnt/lib - which is the lib folder in our current directory.

This recipe works! The result is a lib/ folder full of Amazon Linux Python packages, ready to be zipped up and deployed.
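
A quick way to confirm you got Linux builds rather than macOS ones is to look at the compiled extension filenames. A sketch:

from pathlib import Path

# Extensions built for Lambda should look like
# *.cpython-39-x86_64-linux-gnu.so, not *-darwin.so.
for so_file in sorted(Path("lib").rglob("*.so")):
    print(so_file.name)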

Running an ASGI application

I want to deploy Datasette.

Datasette is an ASGI application.

But... AWS Lambda functions have their own weird interface to HTTP - the event and context parameters shown above.

Mangum is a well-regarded library that bridges the gap between the two.

Here's how I got Datasette and Mangum working. It was surprisingly straightforward!

I added the following to my requirements.txt file:

datasette
pysqlite3-binary
mangum

I deleted my lib folder:

rm -rf lib

Then I ran the magic incantation from above:

docker run -t -v $(pwd):/mnt \
  public.ecr.aws/sam/build-python3.9:latest \
  /bin/sh -c "pip install -r /mnt/requirements.txt -t /mnt/lib"

I added the dependencies to a new function.zip file:

rm -rf function.zip
(cd lib; zip ../function.zip -r .)

Then I added the following to lambda_function.py:

from datasette.app import Datasette
import mangum


ds = Datasette(["fixtures.db"])
# Handler wraps the Datasette ASGI app with Mangum:
lambda_handler = mangum.Mangum(ds.app())

I added that to the zip:

zip function.zip lambda_function.py

Finally, I grabbed a copy of the standard Datasette fixtures.db database file and added that to the zip as well:

wget https://latest.datasette.io/fixtures.db
zip function.zip fixtures.db

The finished function.zip is 7.1MB. Time to deploy it:

aws lambda update-function-code \
  --function-name lambda-python-hello-world \
  --zip-file fileb://function.zip

This did the trick! I now had a Datasette instance running on Lambda: https://fnkvspusjrl5dxytaxnuwidxem0hverw.lambda-url.us-east-1.on.aws/

The default Lambda configuration only provides 128MB of RAM, and I was getting occasional timeout errors. Bumping that up to 256MB fixed the problem:

aws lambda update-function-configuration \
  --function-name lambda-python-hello-world \
  --memory-size 256
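
The same configuration change can be made from Python, if that's more convenient. A boto3 sketch, again assuming credentials and region are already configured:

import boto3

# Raise the memory allocation; on Lambda, CPU share scales with memory.
boto3.client("lambda").update_function_configuration(
    FunctionName="lambda-python-hello-world",
    MemorySize=256,
)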

This should work for Starlette and FastAPI too

Mangum works with ASGI apps, so any app built using Starlette or FastAPI should work exactly the same way.
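
For example, a minimal FastAPI version of lambda_function.py might look something like this - a sketch, assuming fastapi is added to requirements.txt and packaged the same way (the route and message are just illustrations):

from fastapi import FastAPI
from mangum import Mangum

app = FastAPI()


@app.get("/")
def hello():
    return {"message": "Hello from Lambda"}


# Mangum translates the Lambda event/context into ASGI scope/receive/send:
lambda_handler = Mangum(app)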

Pretty URLs

One thing I haven't figured out yet is how to assign a custom domain name to a Lambda function.

I understand doing that involves several other AWS services - potentially API Gateway, CloudFront, and Route 53. I'll update this once I've figured it out.

19 Sep 16:02

Attention is not a commodity

by Doc Searls

In one of his typically trenchant posts, titled Attentive, Scott Galloway (@profgalloway) compares human attention to oil, meaning an extractive commodity:

We used to refer to an information economy. But economies are defined by scarcity, not abundance (scarcity = value), and in an age of information abundance, what’s scarce? A: Attention. The scale of the world’s largest companies, the wealth of its richest people, and the power of governments are all rooted in the extraction, monetization, and custody of attention.

I have no argument with where Scott goes in the post. He’s right about all of it. My problem is with framing it inside the ad-supported platform and services industry. Outside of that industry is actual human attention, which is not a commodity at all.

There is nothing extractive in what I’m writing now, nor in your reading of it. Even the ads you see and hear in the world are not extractive. They are many things for sure: informative, distracting, annoying, interrupting, and more. But you do not experience some kind of fungible good being withdrawn from your life, even if that’s how the ad business thinks about it.

My point here is that reducing humans to beings who are only attentive—and passively so—is radically dehumanizing, and it is important to call that out. It’s the same reductionism we get with the word “consumers,” which Jerry Michalski calls “gullets with wallets and eyeballs”: creatures with infinite appetites for everything, constantly swimming upstream through a sea of “content.” (That’s another word that insults the infinite variety of goods it represents.)

None of us want our attention extracted, processed, monetized, captured, managed, controlled, held in custody, locked in, or subjected to any of the other verb forms that the advertising world uses without cringing. That the “attention economy” produces $trillions does not mean we want to be part of it, that we like it, or that we wish for it to persist, even though we participate in it.

Like the economies of slavery, farming, and ranching, the advertising economy relies on mute, passive, and choice-less participation by the sources of the commodities it sells. Scott is right when he says “You’d never say (much of) this shit to people in person.” Because shit it is.

Scott’s focus, however, is on what the big companies do, not on what people can do on their own, as free and independent participants in networked whatever—or as human beings who don’t need platforms to be social.

At this point in history it is almost impossible to think outside of platformed living. But the Internet is still as free and open as gravity, and does not require platforms to operate. And it’s still young: at most only decades old. In how we experience it today, with ubiquitous connectivity everywhere there’s a cellular data connection, it’s a few years old, tops.

The biggest part of that economy extracts personal data as a first step toward grabbing personal attention. That is the actual extractive part of the business. Tracking follows it. Extracting data and tracking people for ad purposes is the work of what we call adtech. (And it is very different from old-fashioned brand advertising, which does want attention, but doesn’t track or target you personally. I explain the difference in Separating Advertising’s Wheat and Chaff.)

In How the Personal Data Extraction Industry Ends, which I wrote in August 2017, I documented how adtech had grown in just a few years, and how I expected it would end when Europe’s GDPR became enforceable starting the next May.

As we now know, GDPR enforcement has done nothing to stop what has become a far more massive, and still growing, economy. At most, the GDPR and California’s CCPA have merely inconvenienced that economy, while also creating a second economy in compliance, one feature of which is the value-subtract of websites worsened by insincere and misleading consent notices.

So, what can we do?

The simple and difficult answer is to start making tools for individuals, and services leveraging those tools. These are tools empowering individuals with better ways to engage the world’s organizations, especially businesses. You’ll find a list of fourteen different kinds of such tools and services here. Build some of those and we’ll have an intention economy that will do far more for business than what it’s getting now from the attention economy, regardless of how much money that economy is making today.

19 Sep 16:01

iPhone 14 Pro’s Lightning connection limited to USB 2.0 speeds, despite larger 48MP ProRaw support

by Steve Vegvari

Apple has upgraded the iPhone 14 Pro and iPhone 14 Pro Max's rear camera array to support 48-megapixel ProRAW photography. As such, the Pro model iPhones can now capture detail-rich photos, though at much larger file sizes.

However, despite these high-res image files coming in at roughly 75MB each, the iPhone 14 Pro's Lightning port is still limited to USB 2.0 transfer speeds. This means the iPhone 14 Pro and iPhone 14 Pro Max are capped at around 480Mbps, the same as previous iPhone models.

With this in mind, users expecting to transfer several full-resolution 48-megapixel ProRaw photos or 8K video from an iPhone 14 Pro to a Mac, PC, or other devices, will be waiting quite some time. To mitigate some of the wait time, Apple recommends using iCloud Photos to access ProRAW photos on another device. Alternatively, you can use AirDrop to transfer large files from the smartphone wirelessly.

It's worth noting that although the iPhone 14 Pro is limited to USB 2.0, Apple has experimented with USB 3.0 in the past. The original iPad Pro, which launched in 2015, supports USB 3.0 over its Lightning connector, enabling speeds of up to 5Gbps. It's unclear why Apple never brought USB 3.0 to its other Lightning devices.

It's been reported in the past that Apple may be transitioning away from its Lightning port to USB-C next year. Claims that the iPhone 15 could soon offer USB-C connectivity have persisted for months. If accurate, the new connector would offer speeds of up to 10Gbps or 40Gbps through Thunderbolt 3.

Via: MacRumors