Shared posts

10 Nov 18:15

Memory capacity growth: a major contributor to the success of computers

by Derek Jones

The growth in memory capacity is the unsung hero of the computer revolution. Intel’s multi-decade, annual billion-dollar marketing spend has ensured that cpu clock frequency dominates our attention (a lot of people don’t know that memory is available at different frequencies, and this can have a larger impact on performance than cpu frequency).

In many ways memory capacity is more important than clock frequency: a program won’t run unless enough memory is available, but people can wait for a slow cpu.

The growth in memory capacity of customer computers changed the structure of the software business.

When memory capacity was limited by a 16-bit address space (i.e., 64k), commercially saleable applications could be created by one or two very capable developers working flat out for a year. There was no point hiring a large team, because the resulting application would be too large to run on a typical customer computer. Very large applications were written, but these were bespoke systems consisting of many small programs that ran one after the other.

Once the memory capacity of a typical customer computer started to regularly increase it became practical, and eventually necessary, to create and sell applications offering ever more functionality. A successful application written by one developer became rarer and rarer.

Microsoft Windows is the poster child application that grew in complexity as computer memory capacity grew. Microsoft’s MS-DOS had lots of potential competitors because it was small (it was created in an era when 64k was a lot of memory). In the 1990s the increasing memory capacity enabled Microsoft to create a moat around their products, by offering an increasingly wide variety of functionality that required a large team of developers to build and then support.

GCC’s rise to dominance was possible for the same reason as Microsoft Windows. In the late 1980s gcc was just another one-man compiler project; others could not make significant contributions because the resulting compiler would not run on a typical developer computer. Once memory capacity took off, it was possible for gcc to grow from the contributions of many, something that other one-man compilers could not do (without hiring lots of developers).

How fast did the memory capacity of computers owned by potential customers grow?

One source of information is the adverts in Byte (the magazine), lots of pdfs are available, and perhaps one day a student with some time will extract the information.

Wikipedia has plenty of articles detailing cpu performance, e.g., Macintosh models by cpu type (a comparison of Macintosh models does include memory capacity). The impact of Intel’s marketing dollars on the perception of computer systems is a PhD thesis waiting to be written.

The SPEC benchmarks have been around since 1988, recording system memory capacity since 1994, and SPEC make their detailed data public :-) Hardware vendors are more likely to submit SPEC results for their high-end systems than for their run-of-the-mill systems. However, if we are looking at rate of growth, rather than absolute memory capacity, the results may be representative of typical customer systems.

The plot below shows memory capacity against date of reported benchmarking (which I assume is close to the date a system first became available). The lines are fitted using quantile regression, with 95% of systems being above the lower line (i.e., these systems all have more memory than those below this line), and 50% are above the upper line (code+data):

Memory reported in systems running the SPEC benchmark on a given date.

The fitted models show the memory capacity doubling every 845 or 825 days. The blue circles are memory that comes installed with various Macintosh systems, at time of launch (memory doubling time is 730 days).
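As a rough illustration of how a doubling time falls out of such a fit: the article's own models use quantile regression (the code+data linked above are in R); the sketch below uses plain least squares on synthetic numbers, purely to show the arithmetic of turning a fitted slope on log2(capacity) into a doubling time.

```python
# Illustrative only: recover a memory doubling time by fitting
# log2(capacity) = a + b*days with ordinary least squares.
# The doubling time is then 1/b days. Data below are synthetic,
# constructed to double exactly every 845 days.
import math

def doubling_time_days(days, capacities):
    """Least-squares slope of log2(capacity) vs days; returns 1/slope."""
    logs = [math.log2(c) for c in capacities]
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(logs) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, logs))
    var = sum((x - mean_x) ** 2 for x in days)
    return var / cov  # 1 / slope

days = [0, 845, 1690, 2535]        # benchmark dates (days since start)
caps = [64, 128, 256, 512]         # reported memory (MB)
print(round(doubling_time_days(days, caps)))  # 845
```

Quantile regression differs from this in fitting a given percentile of the data rather than the mean, which is why the article can quote separate doubling times for the 50% and 95% lines.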

How did applications’ minimum required memory grow over time? I have patchy data for a smattering of products, extracted from Wikipedia. Some vendors probably required customers to have a fairly beefy machine, while others went for a wider customer base. Data on the memory requirements of the various versions of products launched in the 1990s is very hard to find. Pointers very welcome.

09 Oct 02:38

App Store and Congress

For a few months I’ve been trying to avoid the App Store as a topic.

I have — we all have — enough to be upset about these days, and I just don’t need this as another thing that’s eating me up. So I decided, once bigger and louder players got involved, that I could leave it to them and try to ignore the topic. Not let it get to me.

And now there’s a report from a House Judiciary subcommittee on this topic. The report quotes this blog twice (on pages 341 and 350) — and I am very proud.

It means I can stop banging my head against this particular wall — Congress is looking at it. I stand by ready to help, but I think it’s well beyond me now, and I can otherwise let this go. For the sake of my own health.

09 Oct 02:38

YouTube Music lifts casting restrictions on free users

by Brad Bennett

As YouTube Music struggles to get attention from users, it’s lifting its previous casting restriction on personal music.

One of YouTube Music’s secret features is that it allows users to upload their own personal music collection to the service’s cloud locker. This means you can make playlists with music from your personal collection along with songs from YouTube Music’s library.

This feature gives users a lot more music listening freedom, but until today, people who used the free version of YouTube Music couldn’t cast the uploaded music to smart speakers. Now iOS and Android users can cast their own music without paying Google’s monthly fee.

That said, if you try to cast a song from YouTube Music’s library, it will block you and say, “Only uploaded music can play on your speakers.” It’s also worth noting that Google will remove any streaming songs from your YouTube Music queue if they’re mixed in with your personal library.

While this is a big upgrade for YouTube Music users with large libraries, it’s a bit odd that Google is still restricting free users from casting free music. For instance, if you ask a Google smart speaker to play music from YouTube Music, it will do it for free with ads and six skips per hour.

It seems that Google is rolling out this new personal library casting feature to appease people who’ve recently transitioned from Google Play Music since the old service could do this easily. In the future, I wouldn’t be surprised if Google allowed people to cast any music from YouTube Music but then turned it into a song-based radio station with ads and other restrictions.

Source: 9to5Google

The post YouTube Music lifts casting restrictions on free users appeared first on MobileSyrup.

06 Oct 22:29

New audio products from Apple inbound

by Volker Weber


HomePods and headsets have replaced my usage of Sonos. Even more so since we are all working from home. There are three areas where Apple does not yet provide everything I need: more affordable HomePods as Siri access points, over-ear headphones and TV sound.

Third party audio solutions from Bose, Sonos, et al. have disappeared from Apple stores. They are making space for new products. This shall be interesting.

06 Oct 22:28

Livegrep

Link: https://github.com/livegrep/livegrep

“Instantly grep all code at your company from a web interface, with it producing the relevant files, contextual snippets, and links to the full version in Github/GHE/etc.”

[[livegrep/livegrep::https://github.com/livegrep/livegrep]]

When folks ask me a question about our codebase internally I try to

a) answer the question b) say “If I were trying to find the answer to that question with our tools, here’s my entry point, here’s the search query, and here’s my mental heuristic for why I’d click on result #3”

This is trying to thread the needle on always, always being happy to give responsive answers to questions while also increasing people’s ability to self-serve on them in the future.

Since it’s publicly available, let me mention that the most common tool I use for answering these questions is livegrep and that I intend to boot up a livegrep instance on the first day of every startup for the rest of my life.

It borders on miraculous.

via [[Patrick McKenzie]] [[@patio11::https://twitter.com/patio11/status/1086112277659451392]]

06 Oct 22:23

Naar wens

by Lilia

“Naar wens” (as intended, Dutch) said the father of a fifteen-year-old homeschooler when I told him how cool it was to see his son facilitating a workshop for other kids. There was a mix of pride and joy in his words, and a sense of accomplishment with lots of hard work and uncertainties behind it.

This is how I felt six years later, getting comments on Alexander’s work as a guide on the Shtandart, a replica of Peter the Great’s ship. We both spent four days there giving guided tours as part of a bigger program organised by Zaans Erfgoed, a historical organisation in Zaandam, where Tzar Peter stayed incognito in 1697 while learning about shipbuilding.

The Shtandart is a training ship, and Alexander sailed with it as part of the crew in 2019 and 2020. He also went to help with the maintenance in the dry dock, where I ended up joining as well. Speaking Russian, English and Dutch proved to be very useful given a Russian ship and an international crew in a Dutch-speaking environment, so this is how we eventually came to volunteer in Zaandam.

In Zaandam, we joined the crew as well. Although we were spared the galley duties for those days, we did the other ship essentials, led small groups of people in masks around the ship, and served as a linking pin between the professional crew of the ship and the volunteers of Zaans Erfgoed, who were more comfortable in Dutch than English.

It was a bit difficult not to worry about all the imperfections of Alexander’s tours, as well as my own level of Dutch and knowledge of the historical ship, but it was a great experience to see that our work added a lot of value. Alexander was in his element there – climbing the ropes in the morning, standing dressed up on the street, telling the story to the visitors, chatting with volunteers or helping them set up the scene, convincing the crew to move a heavy cannon into a more engaging position, or shooting video with a journalist.

Going against the grain of how things are done in society, as homeschooling does, is full of doubts, so moments like that are very gratifying. Seeing how legitimate peripheral participation moves from ‘peripheral’, as with the archaeologists a while ago, to full-scale productive participation is also a milestone worth celebrating.


The post Naar wens appeared first on Mathemagenic.

06 Oct 22:23

The Library

by peter@rukavina.net (Peter Rukavina)

I have, slowly but surely, been cleaning up the room on our first floor variously known, over the years, as “the office,” “the situation room,” and ”the library.”

Catherine designed the mantlepiece and the shelving, which included a liquor cabinet complete with its own light. Last fall, around this time, when she could no longer navigate the stairs to the second floor, she moved her bedroom here, and so it was, for a time, “Catherine’s room.”

Some weeks ago Oliver and I somehow managed to wrangle the couch, from Catherine’s studio, across the street and into the house; it fits the room well, and arrived just in time to serve as a makeshift bed for our friend Yvonne, who visited this weekend from Halifax. Her visit was all I needed to make the last push toward cleaning the room up: I loaded up a Kia Soul’s worth of various and sundry and dropped it at the thrift shop, dusted and vacuumed, and generally got things ship-shape.

Which allowed me to open the curtains for the first time in a long time.

And to discover that the room gets wonderful sun in the afternoon.

The Library at 100 Prince Street

06 Oct 22:11

Samsung Galaxy Note 8 will no longer receive monthly security patches

by Dean Daley

Samsung’s Galaxy Note 8 will no longer get monthly updates, but it’s not in as bad a spot as fellow 2017 flagship the Pixel 2, which officially reached its end of life. While the Pixel 2 line will get its last update in December, the Note 8 will get quarterly updates for another year.

After over three years of over-the-air updates, Note 8 users will no longer get month-to-month security patches. Instead, those updates will arrive in quarterly bundles.

Unfortunately, the Note 8 never received the Android 10 update and is stuck at Android 9. Samsung now promises that its flagship lines will get three major OS upgrades, but that only includes devices after the S10.

Similar to the S8 series a few months back, Samsung moved the Note 8 to its ‘Quarterly Security Updates’ program.

If you own a Note 8, you’ll continue to see patches and updates quarterly until October 2021. After that date, it’s probably best to get a new device; even before then, with only quarterly patches, your phone won’t be as safe month-to-month. Still, that’s better than the Pixel 2, which will get no updates at all.

Source: 9to5Google

The post Samsung Galaxy Note 8 will no longer receive monthly security patches appeared first on MobileSyrup.

06 Oct 22:11

The Best Audiophile Headphones for Everyday Use

by Lauren Dragan and Brent Butterworth
Our picks for the Best Audiophile Headphones for Everyday Use.

If what you seek in a pair of headphones is the best sound possible for serious listening sessions, but you don’t want to drop thousands of dollars, this is the guide for you. We’ve tested hundreds of audiophile headphones under $1,000 and picked our favorites for different uses.

Most of these over-ear headphones forgo features such as Bluetooth and active noise cancellation and just focus on delivering a great sonic experience.

06 Oct 22:10

Whenever Trump talks about being a ‘leader’, it’s not the draft-dodging that springs to mind first. It’s John McCain refusing an early release from jail in the Vietnam War until his fellow prisoners were freed. I find it almost unbelievably noble. Trump, of course, maligned him.

by James O'Brien (mrjamesob)
mkalus shared this story from mrjamesob on Twitter.

06 Oct 22:10

This is categorically *not* my field of expertise - and I'm almost certainly behind the times - but what an amazing coincidence that one purveyor of antivirus software has 'discovered' that googling the name of another purveyor of antivirus software is a 'massive security risk'.

by James O'Brien (mrjamesob)
mkalus shared this story from mrjamesob on Twitter.

06 Oct 22:03

The Abyss of Ignorable: A Route into Chaos Testing from Starling Bank

Greg Hawkins, InfoQ, Oct 06, 2020

This article is about maintaining cloud servers but contains an important lesson for educators as well, because educators often approach teaching as though they are in some way 'programming' their students. What happens when things randomly change? "We rightly dismiss the absurd and handle the mundane. However, human judgement fails quite badly when it comes to unlikely events.... Developers have generally been poor error handlers since development began." What you need to do is build in the likelihood of random failure from the outset. That's what Starling Bank did - they wrote a script that would randomly crash one of their cloud servers. The result was that developers, knowing that this would happen, had to build resilience into their systems. For educators, this is like building scenarios into their teaching such that whatever the student has learned will fail. How do they then respond? That's where real teaching begins.
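The crash script mentioned above can be sketched in a few lines. Everything here (`instances`, `terminate`) is a hypothetical stand-in for a real cloud API, not Starling Bank's actual code; the point is only how little machinery chaos testing needs.

```python
# Hedged sketch of a chaos-testing step: pick one instance at random
# and terminate it, so teams must design for random failure from the start.
import random

def chaos_step(instances, terminate, rng=random):
    """Kill one randomly chosen instance; return its id (None if none exist).

    `terminate` stands in for a real cloud API call, e.g. an EC2 or GCE
    instance-termination request. Run this on a schedule to keep the
    pressure on.
    """
    if not instances:
        return None
    victim = rng.choice(instances)
    terminate(victim)
    return victim
```

In practice such a script would run from a scheduler against the live inventory, with guard rails (environments it may touch, blast-radius limits) that this sketch omits.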

06 Oct 22:02

The Best USB Hubs

by Nick Guy
Three Anker USB hubs lying next to each other on an orange background. The hub on the right is plugged into a laptop.

Modern computers are thinner and lighter than ever before, but sleek design comes at a cost: The number and types of ports on everything from MacBooks to all-in-one desktops have ended up on the chopping block. If you want to plug in anything with a connection other than a USB-C cable, the most efficient and economical option is to use a USB-C hub, which provides multiple types of ports in a relatively compact package.

Our favorite hub is Anker’s 555 USB-C Hub (8-in-1). In addition to having all the right ports, it’s one of the very few hubs with an HDMI port that supports a proper 60 Hz refresh rate on 4K monitors so that animations and movements on screen are nice and smooth.

06 Oct 21:56

Facebook brings Netflix and Zoom to its Portal devices

by Brad Bennett

Facebook is rolling out updates to its Portal TV and Portal smart display devices to bring Netflix and Zoom to the platform.

If you have a Portal TV device, you’re already able to watch Prime Video and Crave on the service, but now Netflix is joining the party. Facebook is even adding a new remote to the Portal TV with a dedicated Netflix button, like the new Google Chromecast remote.

To be clear, it appears that Netflix is only coming to Portal TV. That said, the regular Portal smart displays are getting Zoom. The popular video chatting service is even taking advantage of the smart displays’ AI camera that follows you around the room.

There are a few other small updates to Facebook’s lagging smart display platform, but for the most part, it’s just becoming a little more functional. The social network also mentions that Zoom will come to Portal TV in the future.

Source: Facebook

The post Facebook brings Netflix and Zoom to its Portal devices appeared first on MobileSyrup.

06 Oct 21:44

A thought-provoking read

by Lilia

A striking aspect of actual descriptions of Dutch society in the pre-war and war years is that the Dutch were so law-abiding. Bureaucratic processes are characterized by formalism, conscientious devotion to duty, and obedience in general and even more so in The Netherlands since the 1960s. The Dutch appear to have had total faith in the authorities, blindly following the orders of superiors. The pillarized Netherlands was a class society with hierarchical interpersonal relations. In the nineteenth and early twentieth century, there was no tradition of resistance or distrust of the authorities in the Netherlands, and unlike the Italians, French or Belgians, the Dutch were unlikely to resourcefully sabotage orders from above. The character structure of public servants in this kind of rigid social order is such that they carry out orders without posing any moral questions as to their legitimacy.

Bovenkerk, F. The other side of the Anne Frank story: The Dutch role in the persecution of the Jews in World War Two. Crime, Law and Social Change 34, 237–258 (2000). https://doi.org/10.1023/A:1008303512172 (full-text access)

I came across this article while looking for more background information for a FB discussion about the reasons for a relatively big percentage of Jews deported from the Netherlands during WWII (76% compared to 30% in Belgium and 25% in France).

I am glad that the inquisitive friend who started the discussion continued to ask “but why?” after my first comment:

Partly due to the good ID registration system (https://jck.nl/en/exhibition/identity-cards-and-forgeries). This is something that was first brought to my attention in discussions of the ethics of data collection and analysis in computer science research. From that perspective, there is also a whole story of IBM’s role in automation during the Holocaust, but it has less to do with the Netherlands.

The discussion that unfolded there (and in my friends-only FB post on it) goes into a bunch of cultural issues that I find very relevant today, given technological developments paired with the strengthening of state controls due to coronavirus and polarisation due to BLM events. These are thorny things to write about (especially for a ‘resident foreigner’ like me, which is another thorny thing), so I leave it at that for the time being.

The post A thought-provoking read appeared first on Mathemagenic.

06 Oct 06:14

The Database is on Fire


Let me tell you a tech story.

Many years ago, I was working on an app with real-time collaboration built into it. It was a neat experimental stack, using early React and CouchDB to their full potential. It synced data live over JSON OT. It was used to do actual work internally, but the general applicability and potential for re-use was evident.

Trying to sell potential customers on this technology, we ran into an unexpected wall. Our tech looked and worked great in the demo video, no problem there. It showed exactly how it worked, and nothing was faked. We had come up with and coded a credible field scenario.

Two users interacting through a mobile app

That was in fact the problem. Our demo worked exactly the way everyone else always pretended their apps worked. Namely that information would flow instantly from A to B, even large media files. That everyone would see new records as they came in. That by adopting an app, users could just work together without friction on the exact same things, even if they had a spotty internet connection in the middle of nowhere. This is the sort of thing every After Effects product video does implicitly.

Despite the fact that everyone knew what a Refresh button was for, it didn't register at all that the web apps they were asking us to build would normally be subject to its limitations. Or that it would be an entirely different experience if you stopped needing it altogether. What they mainly noticed was that you could "chat" by leaving notes between people, so they wondered how this was different from, say, Slack. Oof.

The Design of Everyday Syncs

If you've been working in software for a while, it can be a bit jarring to remember that most people cannot look at a picture of an interface and infer what it will do when they interact with it. Let alone know what's going to happen behind the scenes. Knowing what can happen is largely a result of knowing what cannot happen, or should not happen. It requires a mental model not just of what the software is doing, but how its different parts are arranged and can communicate.

A classic example is the user who is staring at a spinner.gif for 20 minutes, wondering when the work will finally be done. The developer would know the process is probably dead and that the gif will never go away. Its animation mimes that work is happening, but it's not connected to the state of the work. This is the sort of thing where some techies like to roll their eyes at just how misinformed a user can be. Notice though, which one of them is pointing at a spinning clock and saying it's actually standing still?

An animated activity spinner

This is the value proposition of real-time in a nutshell. These days real-time databases are still mostly underused, and still regarded with suspicion. Most of them lean heavily into NoSQL style, which tend to bring up Mongo-fueled benders people would rather forget. For me though, it meant getting comfy on the CouchDB, as well as learning to design schemas that somebody other than a bureaucrat could correctly fill in. I think my time was better spent.

The real topic of this post is what I'm using these days. Not by choice, but through uncaring and blindly applied corporate policy. Hence a Totally Fair and Unbiased comparison of two closely related Real-Time Database Products By Google™.

Firebase Real-Time Database + Firebase Cloud Firestore = Firebase

Anyone need more stickers?

They both have fire in their name. One I regard with fondness. The other is a different kind of fire. If I'm hesitant to name them, it's because once I do, you will run into the first big problem, their names.

One is called the Firebase Real-Time Database and the other is the Firebase Cloud Firestore. They are both products in Google's Firebase suite. The APIs are respectively firebase.database(…) and firebase.firestore(…).

This is because Real-Time Database is just the original Firebase, before Google bought it in 2014. They then decided to build a copy of Firebase on their big data whatsit, as a parallel offering, and named it Firestore with a cloud. Hope you're still following. If not, no worries, I have tried to rewrite this part a dozen times already.

Cos really you have a Firebase in your Firebase, and a Firestore in your Firebase, or at least you do if you want to be able to make sense of Stack Overflow from a few years ago.

If there is a Raspberry award for worst naming in software products, this surely qualifies. The mental hamming distance between these names is so small it will trip up even seasoned engineers, whose fingers will type one thing as their lips speak another. It will royally fuck up plans made with the best of intentions, fulfilling the prophecy of having your database be on fire. I'm not joking in the slightest. The person who came up with this naming scheme has blood, sweat and tears on their hands.

I will call them Firebase⚾️ and Firestore🧯. The baseball is for Firebase, and the fire extinguisher is for Firestore, because you need one.

A man chasing bugs with a butterfly net in a forest that's on fire

Pyrrhic Victory

You might think Firestore🧯 is the replacement for Firebase⚾️, a next-generation successor, but that would be false. Firestore🧯 is patently unsuitable as a Firebase⚾️ replacement. Somebody has seemingly engineered away almost everything that is interesting about it, and messed up most of the rest in various ways.

A cursory glance at the two offerings will confuse you though, because it seems they do the same thing, through mostly the same APIs and even the very same DB session. The differences are subtle and will only be discovered after careful comparative study of the sparse docs. Or when you're trying to port code that works perfectly fine on Firebase⚾️ over to work with Firestore🧯. When you discover that your database interface catches fire as soon as you try to do a mouse drag in real-time. Again, not kidding.

Firebase⚾️'s client is polite in that it buffers changes and auto-retries with last-write-wins best-effort consistency. Firestore🧯 however has a throttle of 1 write per document per user per second, enforced by the server. It's up to you to implement a rate limiter around it yourself when you use it, even though you're just trying to build your app. So Firestore🧯 is a real-time database without a real-time client, wearing the API of one as a skin suit.
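The rate limiter you're left to build yourself might look like the following sketch: a per-document coalescer that keeps only the latest value and flushes at most once per interval. This is plain Python with a callback standing in for the real SDK write, and an injectable clock so the logic is testable; it is not Firestore's actual API.

```python
# Hedged sketch: client-side write coalescing for a backend that only
# tolerates ~1 write per document per second. Last-write-wins, like
# Firebase's own client behaves by default.
import time

class DocWriteLimiter:
    def __init__(self, flush_fn, min_interval=1.0, clock=time.monotonic):
        self.flush_fn = flush_fn        # stand-in for e.g. a db .set() call
        self.min_interval = min_interval
        self.clock = clock
        self.last_flush = {}            # doc id -> time of last sent write
        self.pending = {}               # doc id -> latest unsent value

    def write(self, doc_id, data):
        """Record the latest value; send it only if the interval has passed.
        Intermediate values are silently superseded (last-write-wins)."""
        self.pending[doc_id] = data
        now = self.clock()
        if now - self.last_flush.get(doc_id, -self.min_interval) >= self.min_interval:
            self.flush_fn(doc_id, self.pending.pop(doc_id))
            self.last_flush[doc_id] = now
            return True
        return False  # buffered; call flush_pending() later, e.g. on a timer

    def flush_pending(self):
        """Retry buffered writes whose interval has now elapsed."""
        for doc_id in list(self.pending):
            self.write(doc_id, self.pending[doc_id])
```

A real implementation would also need retry-on-error and flushing before page unload; the sketch only covers the throttle itself, which is the part the server forces on you.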

This is also where you start to see the first hints of the reason for Firestore🧯's existence. Those words "best-effort consistency" tend to make Grumpy Database Admins' few remaining hairs stand up. I could be wrong about this, but I suspect that somebody high up enough at Google looked at Firebase⚾️ after they bought it and simply said: "No. Dear God, No. This is unacceptable. Not on my watch."

A gray beard monk reading from a tome

They emerged from their chambers and declared:

"One big JSON document? No. You will split your data into separate documents, each no more than 1MB in size."

This seems like a limit that will not survive first encounter with any sufficiently motivated user base. You know it's true. Like, we have a few 1500+ slide presentations at my current job, which is Perfectly Normal.

Under this constraint, you will be forced to accept that one "document" in your database bears no resemblance to anything a user might refer to as a document. This is the kind of thing that will cause subtle misery from day 1 in attempting to match end-user requirements, and never cease.

"Arrays of arrays that can contain other things recursively? No. Arrays will contain only objects, or numbers of a fixed size, as God intended."

So if you were hoping to put GeoJSON in your Firestore🧯, you will discover that’s not possible. The same goes for anything remotely not 1-dimensional. Hope you like Base64 and/or JSON inside JSON.
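The JSON-inside-JSON workaround is mundane but worth spelling out: serialize the nested part to a string field before writing, parse it back after reading. The field names below are made up for illustration; nothing here is a real Firestore API.

```python
# Hedged sketch of the "no arrays of arrays" workaround: stash the
# nested coordinate arrays of a GeoJSON geometry as a JSON string.
import json

def encode_geometry(doc):
    """Return a copy safe to store: nested arrays become one string field."""
    out = dict(doc)
    out["coordinates_json"] = json.dumps(out.pop("coordinates"))
    return out

def decode_geometry(doc):
    """Undo encode_geometry after reading the document back."""
    out = dict(doc)
    out["coordinates"] = json.loads(out.pop("coordinates_json"))
    return out

polygon = {"type": "Polygon",
           "coordinates": [[[0, 0], [1, 0], [1, 1], [0, 0]]]}
stored = encode_geometry(polygon)   # no nested arrays left at the top level
assert decode_geometry(stored) == polygon
```

The cost, of course, is that the stored field is opaque to queries and to the admin panel, which is the author's point: the data model pushes you into encodings the database can no longer see into.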

"Import and Export JSON via HTTP, command line tools or the admin panel? No. You will only export and import data to and from Google Cloud Storage. At least I think that's what it's called these days. When I say 'you', I am only talking to those of you with Project Owner permissions. Everybody else go away and file a ticket."

You see, FireBae🤵⚾️'s data model is straightforward to describe. It contains one big enormous JSON document, which maps JSON keys to URL paths. If you HTTP PUT this to your ⚾️ at /:

{
  "hello": "world"
}

Then GET /hello will return "world". For the most part this works as you'd expect. A ⚾️ collection of objects /my-collection/:id is equivalent to a JSON dictionary {"my-collection": {...}} at the root, whose contents are available at /my-collection:

{
  "id1": {...object},
  "id2": {...object},
  "id3": {...object},
  // ...
}

This works fine as long as each insert has a non-colliding ID, which you are provided a standard solution for.
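That key-to-path correspondence is simple enough to sketch. This is illustrative only, a toy resolver over a plain dict, not the real client:

```python
# Minimal sketch of the path <-> tree mapping described above:
# GET /hello on {"hello": "world"} returns "world", and a deeper
# path just walks further into the nested dictionary.
def get_path(tree, path):
    """Resolve a URL-style path against a nested dict, Firebase-style."""
    node = tree
    for key in filter(None, path.split("/")):  # drop empty segments
        node = node[key]
    return node

root = {"hello": "world", "my-collection": {"id1": {"name": "a"}}}
assert get_path(root, "/hello") == "world"
assert get_path(root, "/my-collection/id1") == {"name": "a"}
```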

In other words, ⚾️ is 100% JSON compatible (*) and plays nice with HTTP, like CouchDB. But you mostly use it through its real-time API, which abstracts away the websockets, auth and subscriptions. The admin panel does both, allowing real-time edits and JSON import/export. If you embrace the same in your code, it's quite astonishing how much specialized code disappears once you realize JSON patch and diff can solve 90% of your persistent state handling chores.
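A minimal diff over such a path-addressed tree shows why this fits so naturally. This is a toy sketch; a real OT or JSON-patch library handles arrays, moves, and conflicts that this ignores:

```python
# Hedged sketch: compute path -> new-value changes between two nested
# dicts. Each changed leaf maps onto exactly one path in the JSON tree,
# which is what makes patch/diff a good fit for this data model.
def json_diff(old, new, prefix=""):
    """Return {path: value} for every changed leaf (None marks a deletion)."""
    changes = {}
    for key in set(old) | set(new):
        path = f"{prefix}/{key}"
        if key not in new:
            changes[path] = None                 # deleted
        elif key not in old:
            changes[path] = new[key]             # added
        elif isinstance(old[key], dict) and isinstance(new[key], dict):
            changes.update(json_diff(old[key], new[key], path))
        elif old[key] != new[key]:
            changes[path] = new[key]             # modified
    return changes

old = {"a": 1, "b": {"c": 2}}
new = {"a": 1, "b": {"c": 3, "d": 4}}
assert json_diff(old, new) == {"/b/c": 3, "/b/d": 4}
```

Applying such a diff is just a series of path writes, which is one reason the "everything is one tree of paths" model makes so much persistent-state plumbing disappear.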

🧯's data model is JSON-like, but diverges in a few critical ways. No arrays of arrays was already mentioned. The model for sub-collections is to have them be first class concepts, separate from the JSON document holding them. Because there is no readily available serialization for this, getting data in and out requires a specialized code path. You must build your own scripts and tools to handle your own collections. The admin panel only allows you to make small edits, one field at a time, and does not have any import/export.

They took a real-time NoSQL database and turned it into a slow, auto-joined not-SQL with a dedicated not-JSON column. Something something GraftQL.

A developer sitting at a computer that's on fire

Hot Java

If 🧯 was supposed to be more reliable and scalable, then the irony is that the average developer will end up with a less reliable solution than if they had just used ⚾️ out of the box. The kind of software that Grumpy Database Admin has designed for requires a level of effort, and caliber of engineering, that just isn't realistic for the niche they are supposedly good at. It's similar to how HTML5 Canvas is not at all a Flash substitute, if the tools and player aren't there. What's more, it is steeped in an attitude towards data purity and sterile validation that simply doesn't match how the average business user actually likes to get work done, which is that everything is optional because everything is a draft until the very end.

The main flaw in ⚾️ is that the client was created a few years too soon, before you could reasonably expect most web developers to know what immutability was. As a result, ⚾️ assumes you are mutating your data, and will not benefit from any immutability you feed it. It will also not reuse data in the snapshots it gives you, making diffs much harder. For larger documents, its mutable-diff based transactional mechanism is just inadequate. C'mon guys, we have WeakMap in JavaScript now. It's cool.

If you give your data the right shape, and keep your trees shallow, you can work around this. But I do wonder whether ⚾️ wouldn't get a whole lot more interesting to people if they just released a really good client API based on leveraging immutability, coupled with serious practical advice on schema design. Instead it's like they tried to fix what wasn't broken, and made it worse.


I don't know the full intent behind building 🧯. Speculating about motives inside the Borg cube is part of the fun. The juxtaposition of these two extremely similar but incompatible databases is a relative rarity in the scene. It's almost as if somebody thought "⚾️ is just a feature we can emulate on <Google ☁️>" but hadn't yet discovered the concept of gathering real-world requirements, or coming up with useful solutions for all of them. "Let the developers worry about that. Just make the UI pretty... Can we add more fire?"

I do know a thing or two about data structures. I can certainly see that "everything is one big JSON tree" is attempting to abstract away any sense of large-scale structure from a database. Expecting the software to just cope with any kind of dubious schema fractal is madness. I really don't need to imagine how bad these things can get, for I have done hostile code audits, and seen things you wouldn't believe. But I also know what good schemas look like, how you can leverage them, and why you should. I can imagine a world where 🧯 seemed absolutely like the right thing to do and where the people who did it think they did a good job. But it's not the one I live in.

⚾️'s support for querying is poor by anyone's standards, borderline non-existent. It certainly could have used improvement, or even a rethink. But 🧯 isn't much better, as it is limited to the same one-dimensional indices you get in basic SQL. If you want the kinds of queries humans like to run on messy data, that means full text search, multi-range filters, and arbitrary user-defined ordering. Anything that's basic SQL if you squint is itself too limited. Besides, the only SQL queries people can run in production are the fast ones. What you want is a dedicated indexing solution with grown-up data structures. For everything else, there ought to at least be incremental map-reduce or something.

If you ask Google's docs about this, you will be helpfully pointed in the direction of things like BigTable and BigQuery. All these solutions, however, come with so much unadulterated enterprise sales-speak that you will quickly turn back and look elsewhere.

The last thing you need with a real-time database is something made by and for people who work on an executive schedule.

(*) Just kidding, there is no such thing as 100% JSON compatible.

06 Oct 06:10

The extremely weird story of a remote-learning company that’s making parents livid

Jared Newman, Fast Company, Oct 05, 2020

The story here isn't so much the technology company in question (though I must say its design choices are deeply disturbing) as the deeper issues regarding online content for young students. It's "about schools that failed to fully vet and understand the remote-learning platforms they were deploying as they raced to bring school online," writes Jared Newman. "Faced with extraordinarily difficult circumstances, some schools inevitably made some bad calls." Having content vetted by the wider community would address this issue. But this doesn't work with closed proprietary content.

Web: [Direct Link] [This Post]
06 Oct 06:09

The finger

by nobody@domain.com (Cal Henderson)

Bert was getting tired of tapping his yubikey, so he rigged up a system to do it for him:

I showed this to someone and they said, “So.. you built a button that you press that will press a button? Why not just press the button?” which was a bit infuriating because they clearly missed the whole point. “Don’t you get it? This button BAD, but this button GOOD. Me want to press GOOD button.”

06 Oct 06:05

“If I Had More Resources…”

by Richard Millington

…you would have fewer excuses.

Blaming a lack of engagement on a lack of resources is great, no-one can ever prove you wrong. But that doesn’t mean you’re right.

There are thousands of thriving communities founded by passionate amateurs today who had far fewer resources than you.

The problem is rarely a lack of resources, but often a lack of ingenuity and permission.

Ingenuity matters a lot. If you don’t have the right community concept, can’t find clever ways to keep members engaged, or foster a desire in members to invite others, you’re going to struggle. None of these things are resource-intensive.

Often the problem isn’t a lack of resources but a lack of permission. Permission to do things that need to be done has a far bigger impact on participation than money. If you don’t have permission to promote your community to your customer base, integrate it with the product/service experience, or use the right platform, that can be a killer. And permission is about credibility.

Demonstrate what you can do with what you have. Then ask for more. Asking people to put more resources into a struggling project is never a winner. If you can’t drive a good level of engagement with limited resources, you’re not going to do much better with more resources.

You need to drive plenty of engagement before getting more resources. Companies want to back winners, not losers.

Remember also the kill zone is real. The bigger your community budget, the bigger target it becomes during hard times.

06 Oct 06:05

HyperBlocks come to Snap! — UX for PX in CS4All

by Mark Guzdial

Jens Moenig kindly shared with me a video announcing HyperBlocks that he’s added to the next version of Snap! The idea of hyperblocks is to support vector and matrix operations in Snap!, as in APL or MATLAB.

I’m interested in the research question of whether vector operations are easier or harder for students, including considering who the students are (e.g., does more math background make vector operations easier?) and how we define easier or harder (e.g., is it about startup costs, or the ability to build larger and more sophisticated programs?). My suspicion, based on the work of folks like L.A. Miller, John Pane, Diana Franklin, Debbie Fields, and Yasmin Kafai, is that vector operations would be easier. Students find iteration hard. Users have found it easier to describe operations on sets than to define a process which applies the operation to specific elements. It’s a fascinating area for future research.

And, you can do Media Computation more easily (as Jens shows) which is a real win in my book!

They also have an online course, on using Snap! from Media Computation to Data Science: https://open.sap.com/courses/snap2

Soon after Jens sent me this video, I got to see him do this in real-time at Snap!Con, and then he and Brian Harvey won the NTLS Education Leadership Award for their work on Snap! (see link here). Congratulations to them both!

So here’s the question that I wonder: Who does Snap! work for, and who doesn’t it work for?

  • I find Snap! fascinating but not usable for me. I have tried to do what I see Jens doing, but have to keep going back and forth from the video to the tool. It’s not obvious, for example, where to get the camera input and how to use it. I’m sure if I spent enough time using Snap!, I’d get it. What teachers and students are willing to pay that cost? Obviously, some are — Snap! is popular and used in many places. Who gets turned off to Snap!? Who doesn’t succeed at it?
  • I attended some of the sessions at Snap!Con this summer: https://www.snapcon.org/conferences/2020. I was particularly struck by Paul Goldenberg’s session. He showed videos of a young kid (somewhere between 8 and 10) using Snap!. He was struggling to place blocks with a trackpad. Think about it — press down at the right place, drag across the trackpad without lifting up, release at the right place. This is hard for young kids.

These are important questions to consider in pursuit of tools that enable CS for All. UX for PX: how do we design the user experience of the programming experience?

P.S. Jens just sent me the link to his Snap!Con talk video: https://youtu.be/K1qR4vTAw4w

06 Oct 06:04

Developer Roadmaps

Kamran Ahmed, Roadmap.sh, Oct 05, 2020

This is a really useful set of roadmaps describing what you need to learn if you are pursuing different career paths in web development - Frontend (i.e., the side that runs in the browser), Backend (i.e., the side that runs on the web server), and DevOps. It should be noted that this is only a high-level view; for each item in the diagram there is a constellation of frameworks, templates, and helper applications (you can get a sense of this by looking at the React roadmap on the website). But it's interesting to see just how far this discipline has come since the days of cobbling together some HTML in a text editor and getting it to display in a web browser. I found this resource via the GitHub site but it's the same in both places.

Web: [Direct Link] [This Post]
06 Oct 06:03

Quoting Kevin Roose

I’ve often joked with other internet culture reporters about what I call the “normie tipping point.” In every emerging internet trend, there is a point at which “normies” — people who don’t spend all day online, and whose brains aren’t rotted by internet garbage — start calling, texting and emailing us to ask what’s going on. Why are kids eating Tide Pods? What is the Momo Challenge? Who is Logan Paul, and why did he film himself with a dead body? The normie tipping point is a joke, but it speaks to one of the thorniest questions in modern journalism, specifically on this beat: When does the benefit of informing people about an emerging piece of misinformation outweigh the possible harms?

Kevin Roose

06 Oct 05:55

They Got it Right: Our World in 2022

by Gordon Price

Via Daily Scot:

06 Oct 05:55

Excel spreadsheet limit leads to 16,000 Covid-19 cases left off daily count

by Nathan Yau

Microsoft Excel is useful for many things, but it has its limitations (like all software), which led to an undercount of 15,841 Covid-19 positive tests recorded by Public Health England. For the Guardian, Alex Hern reports:

In this case, the Guardian understands, one lab had sent its daily test report to PHE in the form of a CSV file – the simplest possible database format, just a list of values separated by commas. That report was then loaded into Microsoft Excel, and the new tests at the bottom were added to the main database.

But while CSV files can be any size, Microsoft Excel files can only be 1,048,576 rows long – or, in older versions which PHE may have still been using, a mere 65,536. When a CSV file longer than that is opened, the bottom rows get cut off and are no longer displayed. That means that, once the lab had performed more than a million tests, it was only a matter of time before its reports failed to be read by PHE.
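The truncation described above is silent, which is the real trap: nothing errors out, rows simply vanish. A trivial pre-import guard would have caught it — sketched here in Python, with the limits hard-coded from the two Excel worksheet formats (the function name and pipeline shape are illustrative, not anything PHE actually ran):

```python
import csv

# Hard worksheet row limits for the two Excel file formats.
XLS_MAX_ROWS = 65_536        # legacy .xls
XLSX_MAX_ROWS = 1_048_576    # modern .xlsx

def safe_for_excel(csv_path, limit=XLS_MAX_ROWS):
    """Count the rows in a CSV and report whether Excel can hold them all.

    Returns (row_count, fits) so a pipeline can refuse the import
    instead of letting the bottom rows get cut off silently.
    """
    with open(csv_path, newline="") as f:
        row_count = sum(1 for _ in csv.reader(f))
    return row_count, row_count <= limit
```

Two lines of counting is all it takes to turn "data quietly disappears" into "import loudly refused" — which is why the commentary below is right that this is an infrastructure problem more than an Excel problem.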

The gut reaction is to dunk on Excel, but the whole infrastructure sounds off. Excel wasn’t meant to handle that many rows of data, and as a non-Excel person, I think it’s been like that forever.

Why are these records manually entered and transferred to a database? Why is the current solution to work off this single file that holds all of the data?

I bet the person (or people) tasked with entering new rows into the database aren’t tasked with thinking about the data. Who eventually noticed no new records were recorded after a week?

Such important data. So many questions.

It’s not so much an Excel problem as it is a data problem, and what looked like a downward trend was actually going up.

Tags: coronavirus, Excel, limitations

06 Oct 05:54

Google Pixel 2 series reaches end of life with last update in December

by Dean Daley

Google is rolling out security updates for Pixel phones from the 2 series through the 4a, but this is one of the last such rollouts for the company’s 2017 flagships.

The tech giant promises three years of security patches for each of its smartphones. So, as of today, the Pixel 2 series has officially reached its end-of-life, according to the original Google update calendar. However, the Mountain View-based company has confirmed to Android Police that it will get its last update this December, including a final set of critical fixes from both November and December.

Anyone still using one of these phones might want to consider upgrading. While the Pixel 2 and 2 XL are still usable, they’ll be less secure as each month passes by after December.

Google has influenced other companies like Samsung to do three years of updates, since originally most Android-makers only pushed out two years of updates, including Google. Google actually switched over from two years of updates to three years when it launched the Pixel 2 back in 2017.

It’s worth noting that Apple continues to support its phones for five years of updates, so it’d be nice if Google did the same.

Source: Android Police

The post Google Pixel 2 series reaches end of life with last update in December appeared first on MobileSyrup.

06 Oct 05:54

29th

by peter@rukavina.net (Peter Rukavina)

On this night, around this time, 29 years ago, this woman asked if she could kiss me. The rest is history.

06 Oct 05:53

A software problem. Nothing to be done about that. In Great Britain ...

mkalus shared this story from Fefes Blog.

A software problem. Nothing to be done about that. In Great Britain, some 16k Covid cases allegedly fell through the cracks because (brace yourself!) "the Excel spreadsheet was full".

Well sure, if we have THAT many cases, then our infrastructure simply stops working. Nobody could have foreseen that a pandemic would produce this many cases.

What's that? Why we use Excel instead of proper tooling? Well, er, *paper rustling* oh, is it THAT late already? Gotta run! *dashes off*

(Caution: the source is the Daily Mail)

06 Oct 05:49

Bootstrap 5 Alpha 2

The Bootstrap Blog, Oct 06, 2020

Over the weekend I spent some time looking at modernizing the gRSShopper web interface. It currently runs on a combination of Bootstrap and Jquery. Over the last few weeks, however, a new version of Bootstrap has been released (in alpha) that does not require Jquery. That creates the possibility of a lighter and more responsive interface - but it also set me off on a bunch of side-projects that I'll cover over this week. One of these is the developer roadmap that I've also linked to today.

Web: [Direct Link] [This Post]
06 Oct 05:41

Surely the first time an hour of phone-in has been dedicated to the intricacies of spreadsheet software in the history of British radio - and my callers absolutely nailed it! twitter.com/ruskin147/stat…

by James O'Brien (mrjamesob)
mkalus shared this story from mrjamesob on Twitter.


So now we've learned that it WAS an Excel foulup. Public Health England was using the old xls format that could only accommodate files with around 1400 tests (each one using several rows) - and a surge in positive tests caught them by surprise twitter.com/ruskin147/stat…

04 Oct 20:33

NewsBlur Blurblog: Emotional Consent

sillygwailo shared this story from Steve Pavlina.

Have you ever had the experience of someone venting their emotions onto you or roping you into an emotional conversation, and you never actually consented to sharing that kind of experience with them? How did that feel?

Did you ever feel used, abused, or drained by someone emotionally, especially someone who expected to be granted automatic access to your emotional resources? Did this encourage you to open your heart more to such people, or did you learn to keep your heart closed in such situations?

Just as you need physical consent for sharing physical intimacy with someone, consider that it’s also important to seek consent for engaging in emotional intimacy with someone. This includes something as basic as having an emotional conversation.

On the flip side, just as you may assert boundaries for physical intimacy, consider doing the same for emotional intimacy. Both are requests for your personal resources, and you have every right to deny someone access.

Just as no one is entitled to physical intimacy with you unless you consent to that, it’s wise to regard emotional intimacy in the same light.

Both forms of intimacy can be wonderful and rewarding. They can also be draining. And they are both risky in some situations. Being physically intimate with someone isn’t 100% safe. Nor is sharing emotional intimacy.

You get to make the choice of whom you let in and when. People aren’t entitled to automatic access to your personal resources.

This includes when someone is feeling physically or emotionally needy or entitled.

Emotional Boundary Management

Watch out for manipulative patterns here as well. For instance, if someone labels you emotionally dismissive, it’s similar to being labeled sexually frigid. In both cases there’s an assertion of entitlement to your personal resources (physical or emotional) when consent isn’t willingly offered.

Such labels may be used in an attempt to coerce you into changing your attitude or behavior for the benefit (self-interest) of the person using the labels. Either way it’s a form of nonconsensual emotional manipulation, an attempt to control you when you don’t willingly consent. Don’t fall for it.

Just as you’re free to grant or not grant someone access to your body, you’re also free to decline access to your heart – and your mind for that matter, like when someone feels they’re entitled to free tech advice from you just because you know more about computers.

No one can assert the right to emotional validation from you any more than they can assert a right to have sex with you, regardless of how needy they feel. Someone can only be granted access to physical or emotional intimacy with you by your consent, and if you withhold that consent, then no means no.

What if you’re the one seeking physical or emotional intimacy with someone else? You can invite and offer that type of connection, but don’t assume that you’re entitled to it. Give the other person a chance to decline. Don’t be emotionally creepy or rapey.

If someone declines to grant such consent, you aren’t entitled to coerce or manipulate them into doing what you want. If you find such a relationship unsatisfying, you do have the option to disengage from the relationship and get your physical and emotional needs met elsewhere – with someone else who is willing and able to consent… or on your own.

Pre-consent is a valid option too, like with a relationship partner. Just as you may have an understanding for physical consent (so you don’t necessarily need to ask for it explicitly each time), you may have a similar arrangement for emotional consent. But someone can still choose to decline. You can still say no to an emotional discussion if you’re not feeling up to it. And you don’t have to permit someone to emotionally vent at you, even in a relationship. You can be emotionally intimate without ever going there, just as you can be physically intimate without ever wanting to explore certain kinks.

Maybe you have friends where you both feel it’s okay to unload on each other emotionally at times. But even in such situations, I think it’s good to check in and make sure it really is consensual. Some people may assume that emotional venting is okay at any time, but is that a valid assumption? Isn’t it wise to check if it really is okay with the other person? What if they’re emotionally tired or just not in a good state to listen? Be very careful about assuming consent if you aren’t certain you have it.

One reason I share this is because I’ve received a lot of emails over the years from people who feel emotionally drained by their relationship partners, friends, family, and co-workers. You may not see just how much venting drains other people who are subjected to it. They may do their best to listen and be compassionate, but that doesn’t mean they actually like it, and many would prefer not to be other people’s emotional teddy bear.

It’s up to you when you do or don’t grant consent for various forms of intimacy. Open your heart when you feel it’s wise to do so. You can make these decisions out of self-interest, generosity, kindness, or based on any other values that resonate with you.

Your body. Your heart. Your mind. Your rules.

Opening Your Heart

What do you consider worthwhile opportunities to share emotional intimacy with someone? When do you feel inclined to open your heart willingly? Pay attention to when these kinds of connections feel good to you.

Also pay attention to when it doesn’t feel good to open your heart. When have you done this and wished afterwards that you hadn’t? Where have you experienced regrets on this path? Have you ever emotionally invested in someone and felt punished for your efforts?

One pattern I’ve noticed is that it doesn’t feel good to open my heart with someone who’s stuck in entitlement, neediness, or a victim mindset. That’s just a massive turnoff, so I don’t open my heart in that direction much. I’ve done too much of that in the past, and it’s super draining. It feels like having my energy vamped away – energy I’d rather invest elsewhere.

On the flip side, I usually love opening my heart with someone when the invitation aligns with curiosity, passion, growth, or shared interests. I like it when someone just wants to explore how we might connect, and they aren’t coming from a place of neediness and attachment. Those kinds of connections are refreshing, and I often find them energizing and inspiring. This is usually how I connect with someone when we have an in-person meetup together, like I’ve been doing for many years (currently on pause due to the pandemic).

Sometimes I also feel good about opening my heart to someone who wants to connect on the basis of sadness, grief, loss, frustration, regret, or even anger. I usually don’t find it draining to engage with these emotions if the person has a growth mindset. It’s also good when a person invites me to engage with them on this level in a way that leaves me feeling okay to decline. Then I feel like I can make a conscious choice.

Just as our physical resources are limited, our emotional resources are too, so it’s wise to manage them carefully, like investments.

As I’ve gotten better at making these consent-based decisions according to my personal boundaries and desires, I feel more emotionally strong and stable. I’ve become more receptive to emotional invitations from people and in directions that feel aligned to me.

Remember to practice good boundary management with your emotional space. You can save yourself a lot of grief by paying attention to your own feelings and honoring those. That’s perhaps the best guide I’ve found on this path. When you’re being emotionally manipulated, you can probably feel the misalignment between the person’s stated intentions and how you’re actually feeling in response.
