Shared posts

19 Jan 01:39

Chasing Seymour, Supercomputing And Punchcards

by Martin
Book about the CDC-6600 architecture

Back in November last year I wrote about ‘The Supermen’, the story of Seymour Cray and the birth of supercomputing, and mused about my personal reasons to learn about this particular piece of computing history. Reading about something is one thing, getting hands-on experience with technology from the 1960s quite another. Not possible, you say?

Online Access to a Supercomputer of the 1960s

The CDC-6500 at the LCM in Seattle

At first I thought so, too. But then I remembered reading about the Living Computer Museum + Lab in Seattle and its mission not only to exhibit old computers but to have them up and running and accessible to the public, locally and over the Internet. So I had a look, and indeed, they have a working CDC-6500, a machine of the first family of supercomputers, designed by Seymour Cray, that can be accessed over the Internet. All original manuals required to work with the machine are available in PDF format: a document on how to use the Interactive Facility (IAF) via ssh, a guide to the NOS operating system, programming manuals for Basic and Fortran and, very importantly, the PDF version of ‘Design of a Computer – The Control Data 6600’, written by James Thornton in 1970. The combination of this material together with online access to the real machine, and not just to a simulation, creates a wonderful experience if one is willing to spend some time on the subject.

It’s Basic!

SSH Session to the CDC-6500

The first thing I did once I had access to the system was to write a small Basic program and run it. I have to admit that this felt a bit strange. I had always thought of Basic as a programming language for the small microcomputers of the 1980s, not for supercomputers. But there I was, typing in a Basic program on the IAF console and running it. Except for minor details, the procedure is identical to programming Basic on a home computer 15 years later. Each line needs a line number that is used to order the program and to act as a jump destination. Typing just a line number deletes that line, and correcting or changing code on a line requires typing out the full instruction again. In other words: the same thing on a multi-million dollar computer and on a home computer a decade and a half later. Who would have thought that? Not me.
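For illustration, a program of the kind described might look like this. This is a hypothetical example in classic line-numbered Basic; the exact dialect details of CDC Basic may differ:

```basic
100 REM PRINT THE FIRST TEN SQUARES
110 FOR I = 1 TO 10
120 PRINT I, I*I
130 NEXT I
140 END
```

Re-entering line 120 with new text replaces it, and typing ‘120’ on its own deletes the line, the same editing model home computers used well into the 1980s.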

Let’s Do The ‘Real Thing’ – Fortran

A Fortran program on the CDC-6500

But OK, let’s do something a bit more professional, I thought, and started to read the Fortran manual. As the 1960s were the domain of punch cards and batch-driven computing, the manual focused on this method of interaction, which made it a bit difficult to get my first Fortran program running, as there are some differences between Fortran on punch cards and interactive programming on the terminal. But after an hour or so of trial and error and searching online for demo programs from that era, I finally managed to get my program to compile and run. Later, I discovered that it would not have been necessary to venture out to the Internet; the solution was nearer than I thought, as there were some Fortran examples further down in the IAF manual. I might have spotted them earlier if I had had the printed version to browse, but reading the manual sequentially on the tablet didn’t make it obvious. An interesting takeaway: I need to browse more in the future, even in electronic versions of a manual.
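Much of the punch-card heritage survives as source layout rules. A sketch in the fixed-form Fortran of the era (a hypothetical example; the PROGRAM card with its file list follows CDC convention, and the exact requirements of the NOS compiler may differ):

```fortran
C     FIXED-FORM LAYOUT INHERITED FROM THE 80-COLUMN PUNCH CARD:
C     COLUMNS 1-5 HOLD STATEMENT LABELS, COLUMN 6 MARKS A
C     CONTINUATION LINE, AND STATEMENTS GO IN COLUMNS 7-72.
      PROGRAM DEMO(OUTPUT)
      DO 10 I = 1, 10
      K = I * I
      PRINT 20, I, K
   10 CONTINUE
   20 FORMAT (1X, 2I6)
      END
```

Typed at an interactive terminal, the same column rules still apply, which is exactly the kind of detail that trips one up when coming from interactive Basic.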

Hardware Peculiarities

The operating system on a tape

What made the 6000 series of CDC computers different from other systems was that the main CPU was there for number crunching, not for interaction with the user or peripherals. Most of the operating system and the interaction with printers and storage media such as tapes and disks was the domain of 10 peripheral processors that had their own memory. Communication with the main CPU was done through main memory, which all peripheral processors and the main CPU could access. Don’t bother the number-crunching CPU with mundane interaction tasks; let it run and do its job while the peripheral processors do the rest. Multitasking in hardware, if you like. Due to this significant difference in how the machine is set up compared to the computers on our desks today, it was very worthwhile to read J. Thornton’s hardware description of the machine.

Files That Are Not Files

Another interesting thing I learned is that technical terms meant different things then. The term ‘storage’ was used at the time not only for tapes and disks but also for main memory, i.e. what we call ‘RAM’ today. Also, ‘files’ for storing programs and data were not quite what we would refer to as ‘files’ today. On the CDC 6500, a file was more an abstraction of a deck of punch cards, with individual records inside that could be sorted.

Evolution of the Console

Working with the computer on the console is like looking back in time as well. On the one hand, it feels familiar to type in commands on the console and get a response back. But there is no directory structure, files are not really files, and everything is still tailored around physical teletypes that printed commands and responses on continuous paper. A simple backspace key for deleting characters, color, a searchable command-line history, let alone a cursor that can be moved up and down: none of this is part of the NOS experience, as these things were not possible on a teletype and only became possible with ‘glass TTYs’ that abstracted and extended the teletypes.

In summary, I am very grateful to the Living Computer Museum for keeping these machines alive and accessible to the public from any place in the world. It’s one thing to read about the beginnings of supercomputing and mainframes but quite another to actually interact with one and experience the past directly rather than just by reading about it. I especially want to thank Bruce Sherry of the LCM+L for the time he took to answer all my questions and, of course, for keeping this and other machines alive together with his colleagues!

19 Jan 01:37

New government supported software teaches kids to code in Ojibwe

by MobileSyrup

Students across Canada can now learn coding and computational skills, while also developing their knowledge of the Ojibwe language through a new version of MicroWorlds JR.

The software encourages children from pre-readers to grade 4 to learn how to code without requiring reading comprehension. The program uses simple point-and-click operations, “child-friendly” symbols, and audio directions that are now accessible in Ojibwe.

LCSI, the developers of the MicroWorlds line, and non-profit TakingITGlobal collaborated with Jason Jones, Ojibwe language coordinator at the Rainy River District School Board in Fort Frances, Ontario, to get the most accurate translations. Jones consulted community experts and elders for the program, including his grandmother Nancy Jones, who provided the audio recordings.

MicroWorlds JR is a part of TakingITGlobal and LCSI’s Code to Learn Initiative funded by CanCode. The initiative has reached over 50,000 Canadian students to date.

“This is the first ever coding language available in Ojibwe,” said Michael Quinn, president of LCSI. “This will enable children to reinforce their knowledge of Ojibwe and make them extremely proud of their heritage. In addition, as highlighted as a Call to Action in the Truth and Reconciliation Report, teachers and students will now be able to create culturally appropriate interactive curriculum units.”

MicroWorlds JR also incorporates art from Anishinaabe visual storyteller Nyle Johnston, which was featured in the Connecting with Our First Family exhibit at the Art Gallery of Ontario last year. Coding students will be able to use this artwork, allowing them to create interactive cultural stories and games.

“With 2019 being celebrated as the United Nations Year of Indigenous Languages, efforts are underway to offer additional Indigenous language versions of MicroWorlds JR in the coming months,” said Michael Furdyk, director of innovation at TakingITGlobal.

Download MicroWorlds JR in Ojibwe free of charge for Windows or macOS here.

Image credit: Unsplash.

This story was written by Sera Wong and was originally published on BetaKit.

The post New government supported software teaches kids to code in Ojibwe appeared first on MobileSyrup.

19 Jan 01:37

Apple Publishes Videos Showcasing iPad Pro Workflows

by Ryan Christoffel

Apple shared five new videos on its YouTube channel today, all of which center around working on an iPad Pro. Each video covers a different iPad workflow, as described by the following titles:

  • A new way to host your own podcast
  • A new way to create a presentation
  • A new way to go paperless
  • A new way to take notes
  • A new way to design your space

What I love about these videos, each of which is just over a minute long, is that they demonstrate the actual apps and workflows you can use to accomplish these tasks on the iPad Pro. For example, the podcast hosting video features Anchor for recording, editing, and publishing the podcast, Files for adding audio from an external source, and GoodNotes for holding your speaking notes. The video on taking notes features Notability exclusively, highlighting the app's versatility for handwritten and typed notes, drawings, and audio recordings.

Each of the five videos ends the same way, by stating that the video was filmed, edited, designed, and made entirely with the iPad Pro. I can't imagine a better way Apple could push the message that the iPad is a device for getting real work done.

19 Jan 01:37

The Tentpoles of Data Science

What makes for a good data scientist? This is a question I asked a long time ago and am still trying to figure out the answer. Seven years ago, I wrote:

I was thinking about the people who I think are really good at data analysis and it occurred to me that they were all people I knew. So I started thinking about people that I don’t know (and there are many) but are equally good at data analysis. This turned out to be much harder than I thought. And I’m sure it’s not because they don’t exist, it’s just because I think good data analysis chops are hard to evaluate from afar using the standard methods by which we evaluate people.

Now that time has passed and I’ve had an opportunity to see what’s going on in the world of data science, what I think about good data scientists, and what seems to make for good data analysis, I have a few more ideas on what makes for a good data scientist. In particular, I think there are broadly five “tentpoles” for a good data scientist. Each tentpole represents a major area of activity that will to some extent be applied in any given data analysis.

When I ask myself the question “What is data science?” I tend to think of the following five components. Data science is

  • the application of design thinking to data problems;
  • the creation and management of workflows for transforming and processing data;
  • the negotiation of human relationships to identify context, allocate resources, and characterize audiences for data analysis products;
  • the application of statistical methods to quantify evidence; and
  • the transformation of data analytic information into coherent narratives and stories.

My contention is that if you are a good data scientist, then you are good at all five of the tentpoles of data science. Conversely, if you are good at all five tentpoles, then you’ll likely be a good data scientist.

Design Thinking

Listeners of my podcast know that Hilary Parker and I are fans of design thinking. Having recently spent eight episodes discussing Nigel Cross’s book Design Thinking, it’s clear I think this is a major component of good data analysis.

The main focus here is developing a proper framing of a problem and homing in on the most appropriate question to ask. Many good data scientists are distinguished by their ability to think of a problem in a new way. Figuring out the best way to ask a question requires knowledge and consideration of the audience and what it is they need. I think it’s also important to frame the problem in a way that is personally interesting (if possible) so that you, as the analyst, are encouraged to look at the data analysis as a systems problem. This requires digging into all the details and looking into areas that others who are less interested might overlook. Finally, alternating between divergent and convergent thinking is useful for exploring the problem space via potential solutions (rough sketches), but also synthesizing many ideas and bringing oneself to focus on a specific question.

Another important area that design thinking touches is the solicitation of domain knowledge. Many would argue that having domain knowledge is a key part of developing a good data science solution. But I don’t think being a good data scientist is about having specific knowledge of biology, web site traffic, environmental health, or clothing styles. Rather, if you want to have an impact in any of those areas, it’s important to be able to solicit the relevant information—including domain knowledge—for solving the problem at hand. I don’t have a PhD in environmental health sciences, and my knowledge of that area is not at the level of someone who does. But I believe that over my career, I have solicited the relevant information from experts and have learned the key facts that are needed to conduct data science research in this area.


Workflows

Over the past 15 years or so, there has been a growing discussion of the importance of good workflows in the data analysis community. At this point, I’d say a critical job of a data scientist is to develop and manage the workflows for a given data problem. Most likely, it is the data scientist who will be in a position to observe how the data flows through a team or across different pieces of software, and so the data scientist will know how best to manage these transitions. If a data science problem is a systems problem, then the workflow indicates how different pieces of the system talk to each other. While the tools of data analytic workflow management are constantly changing, the importance of the idea persists, and staying up-to-date with the best tools is a key part of the job.

In the scientific arena the end goal of good workflow management is often reproducibility of the scientific analysis. But good workflow can also be critical for collaboration, team management, and producing good science (as opposed to merely reproducible science). Having a good workflow can also facilitate sharing of data or results, whether it’s with another team at the company or with the public more generally, as in the case of scientific results. Finally, being able to understand and communicate how a given result has been generated through the workflow can be of great importance when problems occur and need to be debugged.

Human Relationships

In previous posts I’ve discussed the importance of context, resources, and audience for producing a successful data analysis. Being able to grasp all of these things typically involves having good relationships with other people, either within a data science team or outside it. In my experience, poor relationships can often lead to poor work.

It’s a rare situation where a data scientist works completely alone, accountable to no one, only presenting to themselves. Usually, resources must be obtained to do the analysis in the first place and the audience (i.e. users, customers, viewers, scientists) must be characterized to understand how a problem should be framed or a question should be asked. All of this will require having relationships with people who can provide the resources or the information that a data scientist needs.

Failures in data analysis can often be traced back to a breakdown in human relationships and in communication between team members. As the Duke Saga showed us, dramatic failures do not occur because someone didn’t know what a p-value was or how to fit a linear regression. In that particular case, knowledgeable people reviewed the analysis, identified all the serious problems, raised the issues with the right people, and…were ignored. There is no statistical method that I know of that can prevent disaster from occurring under these circumstances. Unfortunately, for outside observers, it’s usually impossible to see this process happening, and so we tend to attribute failures to the parts that we can see.

Statistical Methods

Applying statistical methods is obviously essential to the job of a data scientist; in particular, knowing which methods are most appropriate for different situations and different kinds of data, and which methods are best suited to answer different kinds of questions. Proper application of statistical methods is clearly important to doing good data analysis, but it’s also important for data scientists to know which methods can reasonably be applied given the constraints on resources. If an analysis must be done by tomorrow, one cannot apply a method that requires two days to complete. However, if the method that requires two days is the only appropriate one, then additional time or resources must be negotiated (thus necessitating good relationships with others).

I don’t think much more needs to be said here as I think most assume that knowledge of statistical methods is critical to being a good data scientist. That said, one important aspect that falls into this category is the implementation of statistical methods, which can be more or less complex depending on the size of the data. Sophisticated computational algorithms and methods may need to be applied or developed from scratch if a problem is too big to work on off-the-shelf software. In such cases, a good data scientist will need to know how to implement these methods so that the problem can be solved. While it is sometimes necessary to collaborate with an expert in this area who can implement a complex algorithm, this creates a new layer of communication and another relationship that must be properly managed.

Narratives and Stories

Even the simplest of analyses can produce an overwhelming amount of results and being able to distill that information into a coherent narrative or story is critical to the success of an analysis. If a great analysis is done, but no one can understand it, did it really happen? Narratives and stories serve as dimension reduction for results and allow an audience to navigate a specified path through the sea of information.

Data scientists have to prioritize what is important and what is not and present things that are relevant to the audience. Part of building a good narrative is choosing the right presentation materials to tell the story, whether they be plots, tables, charts, or text. There is rarely an optimal choice that serves all situations because what works best will be highly audience- and context-dependent. Data scientists need to be able to “read the room”, so to speak, and make the appropriate choices. Many times, when I’ve seen critiques of data analyses, it’s not the analysis that is being criticized but rather the choice of narrative. If the data scientist chooses to emphasize one aspect but the audience thinks another aspect is more important, the analysis will seem “wrong” even though the application of the methods to the data is correct.

A hallmark of good communication about a data analysis is providing a way for the audience to reason about the data and to understand how the data are tied to the result. This is a data analysis after all, and we should be able to see for ourselves how the data inform the conclusion. As an audience member in this situation, I’m not as interested in just trusting the presenter and their conclusions.

Describing a Good Data Scientist

When thinking of some of the best data scientists I’ve known over the years, I think they are all good at the five tentpoles I’ve described above. However, what about the converse? If you met someone who demonstrated that they were good at these five tentpoles, would you think they were a good data scientist? I think the answer is yes, and to get a sense of this, one need look no further than a typical job advertisement for a data science position.

I recently saw this job ad from my Johns Hopkins colleague Elana Fertig. She works in the area of computational biology and her work involves analyzing large quantities of data to draw connections between people’s genes and cancer (if I may make a gross oversimplification). She is looking for a postdoctoral fellow to join her lab and the requirements listed for the position are typical of many ads of this type:

  • PhD in computational biology, biostatistics, biomedical engineering, applied mathematics, or a related field.
  • Proficiency in programming with R/Bioconductor and/or python for genomics analysis.
  • Experience with high-performance computing clusters and LINUX scripting.
  • Techniques for reproducible research and version control, including but not limited to experience generating knitr reports, GitHub repositories, and R package development.
  • Problem-solving skills and independence.
  • The ability to work as part of a multidisciplinary team.
  • Excellent written and verbal communication skills.

This is a job where complex statistical methods will be applied to large biological datasets. As a result, knowledge of the methods or the biology will be useful, and knowing how to implement these methods on a large scale (i.e. via cluster computing) will be important. Knowing techniques for reproducible research requires knowledge of the proper workflows and how to manage them throughout an analysis. “Problem-solving skills” is practically synonymous with design thinking; working as part of a multidisciplinary team requires negotiating human relationships; and developing narratives and stories requires excellent written and verbal communication skills.


A good data scientist can be hard to find, and part of the reason is because being a good data scientist requires mastering skills in a wide range of areas. However, these five tentpoles are not haphazardly chosen; rather they reflect the interwoven set of skills that are needed to solve complex data problems. Focusing on being good at these five tentpoles means sacrificing time spent studying other things. To the extent that we can coalesce around the idea of convincing people to do exactly that, data science will become a distinct field with its own identity and vision.

18 Jan 23:44

Purism announces PureOS Store

by Sriram Ramkrishna

Purism is pleased to announce PureOS Store, a secure alternative to proprietary app stores that respects your privacy and freedom. While there is much to do before we go live, we are well into building the infrastructure and refining the policy. PureOS Store will be a vibrant hub for both mobile and desktop apps.

We envision PureOS Store as the primary community interface for app developers to contribute to the wider ecosystem, without having to understand the underlying technology like packaging or the mechanism of pushing apps upstream. We want to incentivize developers to create software that meets community values with the ultimate goal of incorporation into PureOS itself.

Apps in the PureOS Store will be evaluated on an ascending scale, with maturity ranging from development and beta levels to fully-endorsed, default apps in PureOS. Corresponding “badges” will quickly and clearly display the status of a given app, while also reflecting the software’s freedom, privacy, security, and ethical design.

PureOS Store Badges
PureOS Store Badge Concept

This will not only facilitate trust in the ecosystem, but will empower users to make informed choices about apps before installing them.

In this way, anyone using Librem hardware or PureOS can understand that the software they rely upon is developed, packaged, and distributed in a socially-conscious manner. The values of PureOS Store reflect the values at Purism and, in fact, those we all would like to see in society.

In addition, we believe PureOS Store can facilitate the distribution of software across Librem devices that run PureOS, from laptops to mobile. We hope to develop real convergence — apps that “just work” regardless of which device they’re running on. Distributions like Debian that run on multiple silicon architectures have always held the promise of convergence, but most efforts at achieving this goal have fallen short. Purism is in the unique position of producing a Free and Open-Source Software (FOSS) laptop and smartphone, enabling an entire ecosystem based upon PureOS and our social values.

Stay tuned for more announcements as we get closer to launch.

18 Jan 23:44

Why Data Is Never Raw

Nick Barrowman, The New Atlantis, Jan 18, 2019

There's so much emphasis on things like evidence-based decision-making, or data-driven learning, as though the proponents had never read Kuhn, Lakatos and Laudan. It bears repeating, especially in these days of big data, that "data is never simply given, nor should it be accepted on faith. How data are construed, recorded, and collected is the result of human decisions — decisions about what exactly to measure, when and where to do so, and by what methods." Theory informs data in two key ways: first, the terms we use to describe the world themselves suggest that the world is one way, and not another; and second, what we already believe about the world informs what we believe we see or perceive in the world.

18 Jan 23:44

Community Data Is What People Think It Is

by Richard Millington

(Reminder: prices for our Strategic Community Management course rise tonight.)

It’s a mistake to believe that you win the argument if you have the right data. During this week’s webinar, I presented this slide and asked attendees whether the level of traffic was going up, down, or staying the same.

The audience was split between the three.

We’re all looking at exactly the same graph and coming to completely different interpretations.

That’s pretty incredible, don’t you think?

But this is true of most data. Do you look at the line of best fit, the trend over the past few months or years, or some sort of average over them?
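To make this concrete with numbers rather than a graph: the figures below are invented for illustration (not from the webinar), but they show how a single series can support all three readings depending on which summary one computes. A minimal sketch in Python:

```python
# Hypothetical monthly traffic figures, invented for illustration.
traffic = [8, 9, 11, 10, 13, 15, 14, 16, 13, 12, 11, 10]

def slope(ys):
    """Least-squares slope of ys against 0, 1, 2, ... (the line of best fit)."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    var = sum((x - mean_x) ** 2 for x in range(n))
    return cov / var

print(slope(traffic))               # whole year: positive, so "traffic is up"
print(slope(traffic[-4:]))          # last four months: negative, so "traffic is down"
print(sum(traffic) / len(traffic))  # plain average: roughly flat around 11.8
```

Whoever fits the whole year sees growth, whoever looks at recent months sees decline, and whoever averages sees stability; all three are computed correctly from the same data.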

You can see this in the images here:

This happens with almost every possible metric. Everyone brings their own biases into their interpretation of data. If they don’t know you, don’t believe in you, or don’t understand the community then the same metrics you’re delighted to present can be perceived negatively.

You might be delighted to show the community solved 20% more customer questions than last year while the head of customer support might note it’s still only 5% of the total questions the support team receives. Why bother?

Data is one of many signals that help people understand how the community is doing and what it’s worth. It is neither the strongest nor the weakest. Others include the relationships you’ve developed with senior people, whether you’re delivering impact for them, what the narrative of the community is, who else supports the community, whether they’re engaged in the process of community, etc.

It’s better to have great data than not, but don’t imagine it will be a silver bullet that gets you the support you want.

18 Jan 23:44

HCL is getting the band together

by Volker Weber


Ever since the curious 'partnership' between IBM and HCL around the IBM Collaboration Solutions products — HCL would develop and service Notes & Domino while IBM would keep sales and marketing (!) — I have wondered how it would play out. Towards the end of the year the companies announced that HCL would buy out IBM and be responsible for Notes & Domino, Connections, Sametime, Portal and what not: product management, marketing, sales, development, service, everything. Watson Workspace was not part of the deal and is being discontinued.

And there was one particular person I was interested in hearing about. No, not Ed Brill, who has moved on to Press Ganey for a new career. I was thinking about Uffe Sorensen, currently the sales director at IBM for the collaboration platform. I have known Uffe "forever", and clearly this job would no longer exist at IBM. Would he move on to a different position or just retire? He has been at IBM for more than 48 years, and he was going to see how IBM would celebrate a 50-year veteran. (→ There is no provision beyond 40 years of service.)

As it turns out, Uffe isn't ready to retire. And he is not going to stick it out with IBM. Uffe is joining the band at HCL:

(Reference to the Hitchhiker's Guide to the Galaxy omitted, vowe)... I thrive in a particular environment - and seek that out. I started with IBM in July 1970 (SIC) working with APL/360: an ingenious interactive environment to help business professionals express their business problems and gain IT support to resolve - by yourself. The very first "citizen developer" environment - using IBM 1052 and IBM 2741 typewriter terminals ;o)

But we did wonders - didn't know this would eventually be known as "analytics", "big data", "citizen developers", "portals", what have you.

I'll stay close: I have joined HCL and will continue the #dominoforever voyage on a new vessel under Richard Jefts' lead. I remain committed to helping you and everyone taking advantage of this unique technology and to support the future development and growth of the Domino family as HCL offerings.

I have witnessed the huge change in velocity since IBM and HCL announced the IP partnership in 2017 through the shipment of Domino V10 in 2018.

To me, this is the way to go - and I trust you'll stay with us on this journey.

If you have any doubts, I would love to dispel your concerns... And ensure that HCL exceeds your expectations delivering a high Return On your Investments !

I have lots of good things to say about Uffe, but I will let him speak for himself here. I am looking forward to being in touch in the future.

Next stop: HCL is bringing Episode 2 of The Factory Tour to Milan at the end of February. Who knows who is going to show up there?

18 Jan 23:44

Facebook is not healthy for our friendship

by Carsten Thomas
Do digital friendships on Facebook destroy real friendships? Micky Beisenherz in dialogue with Johannes Ceh about angry postings and their effects on how we live together. "Dear Andi*. I like you. To keep it that way, I am now going to delete you on Facebook." What sounds drastic was [...]
18 Jan 23:44

Twitter Favorites: [Dries] The new Slack logo features four little Druplicons. ;-)

Dries Buytaert @Dries
The new Slack logo features four little Druplicons. ;-)
18 Jan 23:44

Twitter Favorites: [gruber] The original plaid design in tech: @jkottke’s late great Stellar. (Predated Slack by at least two years.)…

John Gruber @gruber
The original plaid design in tech: @jkottke’s late great Stellar. (Predated Slack by at least two years.)…
18 Jan 23:44

Twitter Favorites: [JodiesJumpsuit] Toronto for the foreseeable future. #dlws

jump for my love @JodiesJumpsuit
Toronto for the foreseeable future. #dlws
18 Jan 23:43

Teams, communities and networks: a case of our homeschooling group

by Lilia

We had a meeting in our homeschooling group yesterday to discuss what we actually want to do together, and how, after a period of “let’s see where things go by themselves”. Over the last few months I almost gave up on investing in our local homeschooling community. What we had before was broken, with lots of pain (and the learning that comes with it 🙂 ), and where we live there is no critical mass of like-minded homeschoolers with kids, rhythms and values that match well enough to make it work easily. Still, there is a need in others, a lot of social capital built in the previous period and a willingness to make it work. Some requirements are clear, others still have to be sorted out, and we still have to come up with a way to make it work.

The challenge that we are facing now is about creating a network of families with enough autonomy to go on with their individual educational trajectories and social life, while at the same time with enough overlap to do projects in different constellations of members on a regular basis. As soon as that became clear I had to dig out the work I did on distributed Agile teams, and in particular this picture from my post on teams, communities and networks in terms of communication forms:

Core communication types for teams, communities and networks

All this time I thought of our homeschooling group in terms of a community. This is how I described it in the post on how it worked and what went wrong –
A sense of community. At the same time the operational structure was more of a network, without a single core defining the norms. We had several active nodes, “pulsating” with events and attracting different overlapping groups of participants from a much broader “cloud”. As long as those were in balance and communication between them went well, there was enough shared understanding and network density to give it the feel of a community.

Communication and common ground

What happened next is at some level pretty similar to the challenges of distributed Agile teams (right): relationships and the shared view of what, why and how we do things together started to fall to pieces.

Another interesting thing that happened is that something that felt like a community evolved in two directions. A dense group of people around one node went off into a “team mode”, with closed boundaries, strong ties and structures that enabled getting more things done. The rest fell into the “network” end of the diagonal, lacking the coherence to make things happen.

All of which is very interesting to dive into, but right now I have to run across the border with the kids, so just two things so far:

  • I need to find more on working cases, practices, organisational structures in the networks that are capable of having projects and getting things done on a regular basis.
  • I have to dig out what has happened with the conceptual understanding of communication and sorts of ties in relation to networks, communities and teams while I was not actively present in the field. I’ve seen that Harold Jarche went on building some stuff inspired by the model above, but I have to dig that out.

Any thoughts and pointers are very welcome.

The post Teams, communities and networks: a case of our homeschooling group appeared first on Mathemagenic.

18 Jan 23:43

LEGO in the Lab

Charles Rapple, The Scholarly Kitchen, Jan 18, 2019

This article doesn't have much article to it, but there is a set of links pointing to interesting examples of how Lego is used in the lab and in the classroom (I wish I could see more than just the abstract for the Lego-Github presentation). I have to confess, though - when I was a kid there were two types of people in the world, Lego people, and Meccano people, and I was a Meccano person. Related: Pinterest Lego in the Classroom.

Web: [Direct Link] [This Post]
18 Jan 23:43

Custom Charts – RallyDataJunkie Stage Table, Part 1

by Tony Hirst

Over the last few evenings, I’ve been tinkering a bit more with my stage table report for displaying stage split times, using the Dakar Rally 2019 timing data as a motivator for it; this is a useful data set to try the table out with, not least because the Dakar stages are long, with multiple waypoints (that is, splits) along each stage.

There are still a few columns I want to add to the table, but for now, here’s a summary of how to start reading the table.

Here’s stage 3, rebased on Sebastien Loeb; the table is ordered according to stage rank:

The first part of the chart shows the Road Position (that is, stage start order) using a scaled palette, so that drivers ranked out of start order are highlighted. The name of the Crew and vehicle Brand follow, and then a small inline step chart shows the evolution of the Waypoint Rank of each crew (that is, their rank in terms of stage time to that point, at each waypoint). The upper grey bar shows podium ranks 1 to 3, the lower grey line is tenth. If a waypoint returns an NA time, we get a break in the line.

Much of the rest of the chart relies on “rebased” times. So what do I mean by “rebased”?

One of the things the original data gives us is the stage time it took each driver to get to each waypoint.

For example, it took Loeb 18 minutes dead to get to waypoint 1, and Peterhansel 17m 58s. Rebasing this relative to Loeb suggests Loeb lost 2s to Peterhansel on that split. On the other hand, Coronel took 22:50, so Loeb gained 290s.

Rebasing times relative to a particular driver finds the time difference (delta) between that driver and all the other drivers at each timing point. The rebased times shown for each driver other than the target driver are thus the deltas between their times and the time recorded for the target driver. The rebased time display was developed to be primarily useful to the target driver, with reference to whom the rebased times are calculated.
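The rebasing step can be sketched in a few lines of Python. The driver names and cumulative times below are illustrative values based on the worked example above, not the real timing feed, and the sign convention (positive = target slower) follows the one described later in the post:

```python
# Sketch of the rebasing calculation (illustrative times, in cumulative
# seconds-in-stage at each waypoint; not the real Dakar timing feed).

stage_times = {
    "LOEB":        [1080, 2210, 3405],   # 18:00 to WP1, ...
    "PETERHANSEL": [1078, 2195, 3390],   # 17:58 to WP1, ...
    "CORONEL":     [1370, 2800, 4300],   # 22:50 to WP1, ...
}

def rebase(times, target):
    """Delta between the target driver and every other driver at each waypoint.

    Positive = the target was that much slower (lost time) than that driver;
    negative = the target was quicker (gained time).
    """
    base = times[target]
    return {
        driver: [b - t for b, t in zip(base, cum)]
        for driver, cum in times.items()
        if driver != target
    }

rebased = rebase(stage_times, "LOEB")
print(rebased["PETERHANSEL"][0])  # 2: Loeb lost 2s to Peterhansel at WP1
print(rebased["CORONEL"][0])      # -290: Loeb gained 290s on Coronel
```

The same dictionary of per-waypoint deltas then drives the colouring and bar lengths in the table cells.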

So what’s going on in the other columns? Let’s rebase relative to Loeb.

Here’s what it looks like, again:

The left hand and middle of the table/chart show the time taken in making progress between waypoints.

To start with we have the Stage Gap of each driver relative to Loeb. This is intended to be read from the target driver’s perspective, so where a driver made time over the target driver, we colour it red to show our target lost time relative to that driver. If a driver was slower than the target driver (the target made up time), we colour it green.

The Stage Gap is incremental, based on differences between drivers of the total time in stage at each waypoint. In the above case, Loeb was losing out slightly to the first two drivers at the first couple of waypoints, but was ahead of the third placed driver. Then something went bad and a large amount of time was lost.

But how much time? That’s what the inline bar chart cells show: the time gained/dropped going from one waypoint to the next. The D0_ times capture differences in the time taken going from one split/waypoint to the next. The horizontal bar chart x-axis limits are set on a per column basis, so you need to look at the numbers to get a sense of how much time gained/lost they represent. The numbers are time deltas in seconds. I ummed and ahhed about the sign of these. At the moment, a positive time means the target (Loeb) was that much time slower (extra, plus) than the driver indicated by the row.
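A minimal sketch of the D0_ calculation as described above: first convert each driver's cumulative waypoint times into per-split times, then difference those against the target driver (again with made-up illustrative data, not the real feed):

```python
# Sketch of the D0_ split deltas: time gained/dropped by the target driver
# over each individual split, relative to another driver (illustrative data).

def splits(cumulative):
    """Cumulative waypoint times -> time taken on each individual split."""
    return [b - a for a, b in zip([0] + cumulative[:-1], cumulative)]

def split_deltas(times, target):
    """Per-split deltas; positive = target slower over that split."""
    target_splits = splits(times[target])
    return {
        driver: [t - d for t, d in zip(target_splits, splits(cum))]
        for driver, cum in times.items()
        if driver != target
    }

stage_times = {
    "LOEB":        [1080, 2210, 3405],
    "PETERHANSEL": [1078, 2195, 3390],
}
deltas = split_deltas(stage_times, "LOEB")
print(deltas["PETERHANSEL"])  # [2, 13, 0]
```

Each list of per-split deltas corresponds to one row of D0_ bar cells.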

Finally, the Pos column is rank position at the end of the stage.

If we look down the table, around Loeb, we see how Loeb’s times compare to the drivers who finished just ahead of, and behind, him. For drivers ahead in the ranking, their Stage Gap will end up red at the end of the stage; for drivers behind, it’ll be green (look closely!)

Scanning the D0_ bars within a column, it’s obvious on which bits of the stage Loeb made, and dropped, time.

The right hand side of the figure considers the stage evolution as a whole.

The Gap to Leader column shows how much time each driver was behind the stage leader at each waypoint (that is, at each waypoint, rank the drivers to see who was quickest getting to that point).
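That per-waypoint leader calculation can be sketched like this (illustrative times again): at each waypoint, take the quickest cumulative time, then report how far behind it each driver was.

```python
# Sketch of the Gap to Leader column: deficit to the quickest driver
# at each waypoint (illustrative data, not the real timing feed).

def gap_to_leader(times):
    n_waypoints = len(next(iter(times.values())))
    leader = [min(cum[i] for cum in times.values()) for i in range(n_waypoints)]
    return {
        driver: [c - l for c, l in zip(cum, leader)]
        for driver, cum in times.items()
    }

stage_times = {
    "LOEB":        [1080, 2210, 3405],
    "PETERHANSEL": [1078, 2195, 3390],
    "CORONEL":     [1370, 2800, 4300],
}
gaps = gap_to_leader(stage_times)
print(gaps["LOEB"])         # [2, 15, 15]
print(gaps["PETERHANSEL"])  # [0, 0, 0]
```

Note that the "leader" can be a different driver at different waypoints, which is why the minimum is taken per waypoint rather than per driver.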

Along with the Waypoint Rank and the Road Position, the Gap to Leader is the only aspect of the table that is relative to the driver associated with that row: it helps our target (Loeb) put each other driver’s performance on the stage in the context of the overall stage rankings. The dot marker indicates the gap to leader at the end of the stage.

The 0N_ columns show the time delta on stage between each driver and Loeb, which is to say, the delta between the accumulated stage times for each driver at each waypoint. The final column records the amount of time, in seconds, gained or lost by Loeb relative to each driver in the final stage ranking (penalties excepted).

Looking at the table around Loeb we see the column entries are empty except for the Gap to Leader evolution.

The original version of this chart, which I was working up around WRC 2018, also includes a couple more columns relating to overall rally position at the start and end of the stage. Adding those is part of my weekend playtime homework!

18 Jan 23:41

"Even though the average American corporation resembles a giant Skinner box with a parking lot, no..."

“Even though the average American corporation resembles a giant Skinner box with a parking...
18 Jan 23:41

The Best Work I Saw at the Medium Festival of Photography: Part 2

by Jonathan Blaustein


I never set out to be an opinion columnist.

It’s true.

Hell, before 2008, if you’d told me I’d become a professional blogger, much less do the job for nearly 9 years, I’d have taken you for a crazy person.

But everything realigned 10 years ago, in the eye-teeth of The Great Recession, and frankly, I don’t think the world has been the same since.

It’s funny, reading the papers, following the discussions about whether the 10 year bull market has finally turned bear.

Will the stock market’s tumble, or the government shut down, or Trump’s stunning incompetence, finally derail the strong American economy, and lead to a recession?

I find those articles patently absurd, and my guess is, you do too.

I’m glad the stock market has gone its run, sure, but in every other way, it feels like America is still not back to where it was before the mix of horrible home loans, and the toxic derivative instruments built upon them, created a financial bubble that finally burst in September 2008.

By January 2009, of course, the economy was in pure free-fall, and America inaugurated its first African-American President, tasked with putting the pieces back together. (Tough luck, Barack. You needed the crisis to get elected, I’d imagine, but it meant you spent your best years putting out another man’s fires.)

I admit, knowing it was exactly 10 years ago has been on my mind lately. I first approached Rob Haggart, my long-time editor, because he put out a call looking for Great Recession images in early 2010.

(He complimented the ones I emailed him, pictures from Southern Colorado I’ve mostly scrubbed from the internet, which I’m now re-visiting nearly a decade later.)

Since we were corresponding anyway, I pitched Rob on writing a couple of articles for him, gratis, as I was a fan of the blog, and had been writing on a small-time blog with friends for nearly a year by then.

At that point, when I wrote him, it was spring 2010, and my small commercial photography/printing studio in Taos had seen its business evaporate. I mean, I went from having clients to having none, all within a few months.

The tell-tale sign, I discovered, was that I was getting hired a lot, near the end, to do Canadian passport photos, because all the Canadians wanted to make sure they could get the hell out of the country.


Going into the Great Recession, I was an unknown artist doing all sorts of photo and printing services to make a living, while also running the studio as a gallery. (I sold next to nothing.)

Afterwards, I was a somewhat-known artist, a professional blogger, and a college professor.

But all these years later, I’m just about making what I made before the career-changes happened.

Truth be told, I love the career exchange, and would make it every time, if I could. I get a lot of satisfaction and pleasure out of the work I do, despite the grind of permanent freelance living.

My wife makes more money now, as she went into private practice as a therapist, (after years of working in a local school,) so that helps for sure.

As I said, I’m personally very happy, but in no way do I think that things are “better” in the world than they were before the Crash, and in many ways they seem worse.

Seeing all the income growth go to such a small percentage of Americans wears away social trust, as once people believe a game is rigged, they have much less interest in maintaining said system.

And of course while Obama was left to clean up W. Bush’s mess, the real legacy of The Great Recession was Donald J. Trump.

I’ve been a vocal critic of the now-President here for years, and even I’m stunned to read that the FBI actively investigated whether Trump might be a Russian asset.

(And that he bought a room full of McDonalds and Wendy’s for the Clemson football team.)

This truly unstable world, I believe, was first born in the ashes of the Global Economic Collapse.

All of a sudden, America stumbled.

Even worse than in Vietnam.

The extreme elements in our Capitalistic system wiped out extraordinary amounts of wealth, for ordinary people, and in many cases literally kicked them to the curb.

In the end, essentially no bankers went to jail.

Foreclosed Americans were left to pick up their own pieces, and American taxpayers paid the bill for bailouts.

Are we really surprised that so many people, doing so poorly in depressed areas, would fall for Trump’s con, as he felt their pain and promised to bring their jobs back?

Or that other major nations, like China and Russia, would see our inherent weakness, and push that much harder to take our mantle of power, geo-politically?

I haven’t written a political column in a while, because I try to balance the style and tenor of these articles. It’s one way that I’ve managed to keep it interesting, given that the format is essentially unchanged all these years.

But as it’s early in 2019, and 10 years since that evil 2009, I felt it was a good time to go in this direction.

This story will ultimately be about the second batch of photographers I saw at the Medium Festival of Photography in San Diego last October.

And last week, I wrote my spiel about the city, and gave you all some advice to get out there and hit up the festivals, or travel more this year.

This column is meant to build upon that, if you can believe it.

Because beneath the super-structure of the political critique, (I can’t believe I’m explaining my own meta-level writing,) what I really meant to say was: reinvention is painful.

Change is hard.
And yet it’s always worth it.

One of the cardinal rules of being an artist is that once you realize how deeply you’re embedded in your comfort zone, it’s time to jump out of bed.

Doing these things is much harder than saying them, and pretty much no one chooses to change.

It’s normally forced upon us by life circumstances.

But knowing that you eventually have to shake things up, and then having the guts to make the tough call, these processes lead to growth, as a human and an artist.

I live by my own advice, I swear.

Just the other week, I gave up my beloved Wing Chun Kung Fu, and switched to Aikido, because I knew I needed a new teacher, and a new beginning.

It hurt, but I did it anyway. Because that’s how I was trained at Pratt.

Many of the artists I meet at events like Medium don’t have the MFA degree. They didn’t go to art school, and some haven’t even taken a formal class.

Many of the photographers had a first career. They didn’t follow their passion, initially, but when given the chance later in life, they took workshops, joined critiquing groups, and threw everything they had at their new career as an artist.

Other times, I let my opinions fly, and I might be sitting across from an MFA photographer. Or even better, sometimes, I’ll be critiquing a professor from a really established school.

This visit, a photographer came up to me to re-introduce herself, as I’d been really strong in my advice, during a previous review at Medium. (I insisted that she change her paper type from matte to a photo surface.)

I published her work here, and never thought about it again. But apparently, the woman told me, I’d gotten under her skin, as she resented the advice at first, but then had finally done what I suggested, and found success with the change.

Another person verified that this professor had told the story many times, as I was the “paper guy,” and it had been a big deal in her life.

Honestly, I can’t keep giving beginning-of-the-year-advice-columns much longer. February is right around the corner, and anyway, after today, it will be enough.

The best I can say to you is to try to embrace some change, in 2019, and push yourself hard.

Try a different medium. Go somewhere new. Sign up for a class at a local community college. Switch to black and white. Make a video.

Times of upheaval have a way of re-writing the rules of the game, and why not make yourself stronger, and pick up some new skills, for the decade to come?

Enough said, now we’ll look at the second batch of the Best Work I Saw at the Medium Festival of Photography in October 2018. (As always, they’re in no particular order.)

Victoria Fava was visiting from Monterrey, Mexico. She studied art as well as photography, and we spent much of our chat discussing what the optimal medium would be to express her ideas.

She’s been interested in the fact that astroturf, a chemical product developed by Monsanto, is highly utilized there, and oddly is often featured in wealthy homes. (From an American perspective, it seems downscale.)

I like the photos, but personally thought creating installations, making mock-outdoor-scenes indoors, might be the way to go. (Easy for me to say. That’s much harder to pull off than making a photograph.)

CJ Pressma is one of the types of people I alluded to above, as he’s been involved with photography at a high level since before I was born. CJ was visiting from Louisville, where he ran a residency program for many years.

He was also a master printer, doing portfolios for people like Meatyard, and my colleague Brian Clamp even mentioned to me during the festival that he had vintage prints that CJ had made back in the day.

At Medium, CJ showed me a book he’d made pairing (mostly) night photographs with faux dream diary statements he’d asked his friends to contribute. The one image of the frozen truck was probably the best single image I saw that week.

Bil Zelman is one of the few people in the world who make me jealous, as he lives in Encinitas, my favorite beach town in California. (Though all of North County is pretty cool, IMO.)

He’s primarily a commercial and editorial photographer who self-financed a personal project looking at elements of the landscape that reflect our anthropocentric times. (Non-native species, non-native trees, etc.)

Given the high flash at night, they’re super dynamic. And I had to lay it on hard to convince Bil that he shouldn’t lead with 15 tree pictures before showing the alligators and Burmese python.

Never bury the lede!

But Bil told me he mixed it up for later reviews, and received some really great responses.

Justin Nolan is another example of one of the types I mentioned above. He’s a professor at the University of Central Florida in Daytona, and he got his MFA at UNM in New Mexico not too long ago.

Once I knew his training, I pushed him pretty hard, and asked some difficult questions. I never would have gone down that interrogative rabbit hole, though, with someone who was new to the field, or hadn’t been trained in the critique process.

Needless to say, I didn’t love one of his projects, but found his take on Florida, his new home, to be witty and great. I make fun of Florida a lot on Twitter, (as does anyone paying attention to what happens down there,) but I liked that Justin’s subtle style contrasted with that over-the-top reputation.

Finally, we have Sheri Lynn Behr, whom I met at Photo NOLA back in 2012. (See what I mean about going to festivals. You can stay in touch with so many people.)

Sheri mentioned to me, in the hall before the review, that she’d heard I was tough, and that she wanted a tough critique. I knew her work was doing well, as she’d just had a solo show at the Griffin Museum in Massachusetts.

Sure enough, though, she showed me a bunch of projects that were mixed together, and printed on different paper surfaces. It was one of those crits where she had an answer for most of my issues, and was fairly wedded to her process, so I let it drop.

Her meta-project, which she made into a book, is called “BeSeeingYou,” and is all about surveillance culture. This one vertical piece stood out to me so powerfully that I’m going to show it by itself.

That’s it for today, and we’ll be back to the book reviews next week. I am planning to hit up a few festivals in 2019 though, including Photo Lucida in Portland, which will be my first time.

So I’ll be sure to report from the field again as soon as I’m able.


Visit our sponsor Photo Folio, providing websites to professional photographers for over 10 years. Featuring the only customizable template in the world.


18 Jan 23:41

Reducetarian, not vegetarian

by Caterina Fake

I like a good steak. I really, really like a good steak. I order it rare. Other members of my family order it blue. And my grandfather used to eat his almost raw, instructing the griller, “just restore the body heat”, which, let’s be honest, is a really gross way to order food. That’s the kind of carnivores I come from. Tartare? Yes. Oh yes.

And I love animals, really really really. Animal rights? Makes me sick. My first exposure to the ghastliness of the industrial meat issue was Sue Coe’s terrifying book Dead Meat, which I read in 1996. More recently we watched the documentary Food, Inc. with the kids, probably the first horror movie they’ve seen. As the years have passed we’ve become more and more aware of the terrible things required to produce industrially farmed meat – through movies, articles, books, the rise of various organizations promoting animal rights, even The Smiths album Meat is Murder.

As investors, we’ve looked at– and sampled–a lot of alternative protein products: classics like Tofurkey, Garden burgers and Boca Burgers. Second wave meat alternatives like Soylent, Beyond Meat, and we always get an Impossible Burger at Gott’s at the airport before boarding flights. We’ve eaten crickets, witchetty grubs, a vast array of soy products pretending to be meat, fake meat comprised of mushrooms and beets, and bland, frightening and generally unidentified frankenfood.

I am a failed vegetarian. My efforts to eliminate meat from my diet made me realize how anemic I was: I wasn’t good about taking my vitamins or making sure I had a good source of iron. I fainted several times, and ended my stint as a vegetarian when I entered a kind of fugue state and found myself sitting at the counter of Jackson Hole Burgers eating a 7 oz. burger. That is not a small amount of meat. But what’s a woke carnivore to do?

Reduce. Our kids call themselves not vegetarians – they still like the occasional slice of bacon – but reducetarians. Say it out loud: it sounds better than it looks. And it makes sense, doesn’t it? I remember the short TED talk by Graham Hill in which he proposed being a “weekday vegetarian”, which is along reducetarian lines. Just eat less.

This is the future of food. Millennials are all on board, and leading the charge. 70% of the world population reportedly is either reducing meat consumption or leaving meat off the table altogether, according to market research from GlobalData, which works with 4000 consumer brands.

I really struggle with this, I’m a true carnivore. Some people have told me it’s my blood type, and maybe I need to take my vitamins. But it’s getting easier and easier for us woke carnivores to reduce, like we’ve wanted to.

18 Jan 23:41

Collecting Whole Earth

by (Peter Rukavina)

Patrick Rhone:

When passing by a used bookstore, there are certain things that I must buy if I see them. I simply can’t help myself, no matter the price. This is one of those things.

“One of those things,” in this case, is copies of the Whole Earth Catalog.

I suffer from the same condition:

Photo of my collection of Whole Earth Catalogs, on my bookshelf.

18 Jan 23:41

New Issues for OERs

by Stephen Downes
This is based on my contribution to the new Creative Commons 'Education Platform' discussion of issues related to open educational resources (OERs). This is the next step following the development of a set of principles for the Platform.

(I’m not sure of the origin of the set of suggestions above (in the original document, and here discussed below), but I am adding an extra set below based on issues created by next-generation technology - because it seems to me the above issues are looking at the old problems (where OER is a kind of publishing) - Stephen)
  • Models and practices for the development and distribution of data-driven resources - issues here include open data, data API standards, OERs from technology such as Jupyter Notebooks - how do OERs, data and software interoperate? 
  • OERs and cloud technology (eg. is a resource in AWS or Google Cloud ‘open’, especially if it depends on cloud functionality), cloud-based open education systems (eg CodePen) - is the ‘hosting’ of an OER also required to be open? 
  • OER-graph methodology, eg., systems and methods for linking resources, version control, etc. Is a resource in GitHub an OER? GitLab? Does ‘attribution’ require hashed reference to the original? Is the provenance of a resource part of the resource? 
  • The relation between OERs and personal data - eg. is it a violation of the presumptions of OER for a provider to create a profile or identity graph of people using OERs? There are also questions (some raised above) about the role of OER in creating identity and therefore OER-colonialism 
  • Post-HTTP resource distribution, eg., the use of content-addressable networks such as Interplanetary File System (IPFS) - is licensing relevant when certain forms of publication are permanently open? 
  • The role of OER in recognition systems (for production, use, etc) and the relation between OER and credentials (especially closed or proprietary credentials), and similarly, the relation between OERs and competencies (esp. Closed proprietary content standards) 
  • The nature of truth in OERs - how is truth created and/or established in OERs (peer review? Consensus mechanisms?) and how does the community respond to the equivalent of fake news in OERs produced by advertisers and bad actors? How is agreement in the OER community reached (including but not limited to agreement about this set of issues)? 
  • The difference between previous iterations of learning technology, which are authority-based and centralized, and that which we are experiencing with modern web technologies where learning is based on creative activities that are distributed and democratized. 
  • The relation between OER and wider social-economic-political issues - not just inclusivity and access (mentioned above) but the role of OERs in promoting individual security, identity, voice, opportunity - is it an educational resource if it doesn’t advance the learner?

Unattributed suggestions from the 'email thread', organized into categories by me (with my commentary following):

       Funding, Support, Development - Faculty

  • Funding faculty development efforts around OER, including adoption/adaptation/creation/OEP (funds for course releases, peer reviewing, etc.)
  • Encouraging faculty use and creation of OER by changing promotion/tenure policies so that faculty “get credit” for OER
  • Better support for faculty interested in practicing open pedagogy (e.g., expanding resources like Open Pedagogy Notebook)  
  • Scholarly Activity: the “coin of the realm” for faculty in many disciplines is book/textbook publishing, which is a refereed process. Open publishing is not respected for promotion and tenure. We need more refereed open presses that are endorsed by disciplinary associations.
       I have a hard time supporting funding for the richest people in the education ecosystem. I think that support for OER will have to come from the community and especially from learners, and that those faculty who are supportive will be helpful, and those who are not will not likely be enticed by lobbying or even payment. I would note, in fact, that in the realm of open access publication generally it has been necessary to create open access mandates rather than support or incentives. The issue here is: should we encourage OER mandates for educational materials, and if so, how? Meanwhile, if there are cases where funding would help, we should be asking how funding can support those who would most benefit from it, not how it can be added to the wealth of those who are already well-funded.

       Funding, Support, Development - Other
  • Creation of OER by non-faculty scholars: where are these folks? How to reward them?
  • Better way to coordinate creation of new OER on a national/international scale, so different groups aren’t unknowingly working on similar projects. How can we as a community keep track of needed OER and prioritize what should be created next?
  • What models in other open communities (wikipedia as one example) can be used/recreated/modified to support individual contributions to continuously improved central resources?
  • Making use of growing public-domain material (literature, especially) 
       I think the OER movement in general needs to become more aware of the large body of learning resources that exist outside traditional OER circles, and think about creating and defending ways of using these, including such things as fair use/fair dealing. I don't think it's necessary to fund these people so much as it is necessary to give them space and encouragement. The issue is: how can we (the academic community) support others in the development of OER?


  • An easier way to edit and customize OER with a free tool.
  • Open platforms, analytics, algorithms and adaptive tools to support open learning content.
  • Improving discoverability of OER Standardizing OER metadata (e.g, adding a field to MARC records to indicate whether something is OER?) 
  • Uniform rating system for OER
  • Better versioning control
  • Is a central repository for higher-ed OER a realistic option? (Faculty say they want one.) Who would manage such a thing? Where would ongoing funding come from?
  • “Continuous Improvement” = “continuous change”, which many faculty may not be comfortable with. “This section was in Chapter 4 last week. Why is it now in Chapter 5?? I can’t teach like this!” Perhaps on-the-fly online corrections of small errata (e.g. typos), with periodic “editions” that address bigger-picture feedback?
        I think that these are good suggestions in general. But I think the way we are stating the issue should be flipped around - instead of asking "how can we get other people to give us tools" we should be asking "how can we provide tools for other people (and especially those who are disadvantaged, from unrepresented cultures, or those who could not afford tools otherwise)?"

       Nature and/or Type of OER
  • Supporting the development of diverse, inclusive OER and determining methods for increasing the inclusivity of existing OER
  • How can OER use and adoption be used to drive broader improvement of open materials and ongoing advances in understanding of how humans learn? What are necessary infrastructure components to realize OER potential in iterative improvement and learning science advances?
  • How can open models be brought to bear to advance and improve advanced technologies in ways that make these approaches more broadly accessible, affordable and that identify (and remove) bias? I’m especially thinking here of learning analytics.
  • Keeping accessibility for persons with disabilities in the forefront when creating/adapting/tagging materials.  
  • Creating a robust and rigorous set of open assessments to use with K-12 OER instructional materials
  • Expanding availability of free or low-cost ancillary materials (e.g., PowerPoint slides for instructors, test banks, courseware) to compete with commercial publishers’ “course in a box.”
        These items in general state requirements for OER - some specify the type of material (assignments, presentations, etc.) and others desirable qualities in the material (diversity, inclusion, accessibility). In reality, though, when people are creating free resources, you can't come along and tell them what to create (and your statements about what ought to be created will be taken as pedantic and preachy). Diversity and inclusiveness are good, but the only way to ensure diverse content is to ensure diversity of origin, which means opening up the development of OERs beyond teachers and publishers.

       Research / Lobbying / Marketing
  • Expanded (systematic, mixed-methods) research into barriers/facilitators to the adoption and sustaining of open educational resources/approaches.
  • continued awareness-raising re: the benefits of open education (and the availability of OER as a viable option); directing public education funding to procure or create OER
  • Building faculty confidence in the quality of OER (e.g., more pre-publication formal peer review, listing other institutions that use a particular open textbook)  
        The question is, what are we lobbying for? Is it simply to convince government and institutions to shift their spending from commercial publishers to us? Based on what - low cost? Higher quality? Even if these bases are supported, it really feels like a shift from one sort of commercial production to another. We're no different from any other lobbying industry. Worse, I think direct funding to content creators is probably not the most sustainable solution for OERs, as they must be in constant production. Open access to education could be empowering and revolutionary, but not if it's presented as just another way to produce content.

        Defense against enclosure and commercialization
  • putting open content in proprietary platforms that restrict access and/or violate student data privacy
  • Dealing with competition from commercial publishers (e.g., inclusive access). These publishers have large sales forces that far outnumber OER advocates on a campus. A BIG challenge is weaning faculty away from publishers' turnkey solutions. Publishers have large dedicated development staff to create ancillary materials such as PowerPoints, test banks, and online interactive practice tools that feed student scores back to the LMS. Publishers (e.g. Pearson) may be shooting themselves in the foot by ceasing printing textbooks - many students still want a physical copy.
  • “Fauxpen” content and “openwashing” -- confusion among faculty over what’s free vs. what’s truly open 
  • Technologies for finding and using OER often are not compatible with the technology used by those with sensory or physical challenges.
        These issues again pose a marketing or promotion challenge for OER, as if it were just another type of content production enterprise, instead of a mechanism for making fundamental changes in the nature of education itself (changing it from being something only a few people can access, to something everybody can access). The true issue here is: how can we design OERs so they can't be enclosed in this way? Can we depend on market forces, legal forces and political forces? Doubtful.

       Legal and Ethical

  • As faculty use of OER grows, building the knowledge base around copyright issues (it is a legal area, and not too many non-law faculty are well-acquainted with the issues)
  • Community standards around acceptable learning data: in-course use, collection, sharing, retention, research use.
  • More clarity around open licensing… there is still a great deal of confusion around how this works, particularly when materials are being developed by faculty as part of their teaching loads. Work for hire? Also, what happens “downstream”? What do I do when someone takes my text and revises it to incorporate bias and/or incorrect facts? These are questions we get a lot from faculty.
  • Academic Integrity: Students need help understanding how to interact with open materials in terms of attribution/citation, etc., and faculty need help teaching this. “If it’s ‘open,’ can’t I just copy and paste?” We’re seeing increasing academic integrity issues on our campuses and folks are pointing to OERs as one of the culprits.
  • As government investment continues to increase in OER and expectation of lawmakers on adoption grows, how to maintain institutional autonomy and faculty academic freedom.
  • Ensuring OER adoption does not become an economic divide. For example, California legislators suggested in 2016, as now reflected in the new budget, that prisoner education should use OER so the state would not have to spend as much on prisoner education.
        We need to think of ways of getting the legal and ethical issues out of the system entirely, rather than thinking of trying to entangle people in more legal and ethical issues. Wanting to teach and learn should be simple and basic things that anybody can do, not enterprises fraught with legal and moral risk. We should be asking questions like: how can we make OERs work without worrying about licensing? How can we make it so people can use and reuse without worrying about the morality of it all? Are there ways we can insulate OERs and education generally from budget considerations?
18 Jan 23:41

Negotiate your salary like a 6-year old

You’re applying for jobs, you’re hoping to get an offer soon—and when you do you’ll have to face the scary issue of negotiation.

If you read a negotiation guide like Valerie Aurora’s HOWTO (and you should!) you’re told you need to negotiate: you need to ask for more. And that’s good advice.

The problem is that asking for more is scary: it feels confrontational, your body will kick in with an unhelpful adrenaline surge, it’s just not comfortable. And honestly, given you only negotiate your salary every few years, that adrenaline surge probably isn’t going to go away.

But I think you can get past that and negotiate anyway—by embracing your inner 6-year old. In particular, a 6-year old negotiating for snacks on a playdate.

Snack time!

Let’s set the scene.

A 6-year old named Emma is visiting her friend Jackson, and now it’s time for a snack. Jackson’s parent—Michael—now needs to feed Emma.

Michael would prefer the children not eat crackers, but he has a problem. He has some authority over Jackson, since he’s his parent, and some knowledge of what Jackson is willing to eat. So he can say “you’re eating one of these mildly healthy snacks” and that’s the end of it.

But Emma is just visiting: Michael has less authority, less knowledge, and a hungry 6-year old is Bad News. So Michael comes up with an acceptable range of snacks, and tries his best to steer towards the ones he considers healthier.

The conversation goes something like this:

Michael: “Would you like some fruit?”

Emma: blank stare.

Michael: “How about some cheese?”

Emma: shakes her head.

Michael: “Yogurt?”

Emma: shakes her head.

Michael: “Crackers?”

Emma and Jackson: “Yes!”

Michael has committed to feeding Emma something, he doesn’t want her to go hungry—but he doesn’t have the normal leverage he has as a parent. As a result, Emma can just stall until she gets what she wants. Particularly enterprising children will ask for candy (even when they would never get candy at home!), but stalling seems well within the capacity of most 6-year olds.

Salary time!

The dynamic of hiring a new employee is surprisingly similar.

Whoever is making the offer—HR, an internal recruiter, or the hiring manager—has already committed to hiring you. They’ve decided: they had interviews and meetings and they want to get it over with and just get you on board.

So they come up with an acceptable salary range, and offer you the low end of the range. If you accept that, great. And if you say “can you do better?”

Well, they’ve already decided on their acceptable salary range: they’ll just move up within that range. They’re not insulted, they’re used to this. They’re not horrified at a hole in their budget, this is still within their acceptable range.

You went from fruit to crackers, and they can live with that. All you have to do is ask.

Embrace your inner 6-year old

Much of what determines your salary happens before you get the offer, when the decision is made about what salary range to offer you.

You can influence this by the language on your resume, by how you present yourself, how you interview, and by noting you have competing offers. It may not feel like negotiation, but it’s all part of the process—and while it’s a set of skills you can master as an adult, that part is far beyond what your 6-year old self could do.

But the actual conversation about salary? Pretend you’re 6, pretend it’s a snack, and ask for more—chances are you’ll get some delicious crackers.


18 Jan 23:41

Adding an ‘On This Blog Today In…’ Widget

by Ton Zijlstra

Alan Levine recently posted his description of how to add an overview to your blog of postings from previous years on the same date as today. He turned it into a small WordPress plugin, allowing you to add such an overview using a shortcode wherever in your site you want it. It was something I had on my list of potential small hacks, so it was a nice coincidence my feedreader presented me with Alan’s posting on this. It has become ‘small hack’ 4.

I added his WP plugin, but it didn’t work as in the examples he provided: the overview was missing the years. It turned out a conditional that should use each posting’s year was only ever given the current year, so the condition was never fulfilled. A simple change to how the year of older postings is fetched fixed it, and that fix has now been added to the plugin.
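Alan’s plugin is written in PHP for WordPress, but the year-comparison bug described here is easy to illustrate with a short, self-contained Python sketch (the function and variable names are hypothetical, not taken from the plugin):

```python
from datetime import date

def posts_on_this_day(posts, today=None):
    """Return (date, title) pairs from earlier years sharing today's month and day."""
    today = today or date.today()
    return [
        (d, title)
        for d, title in posts
        # The bug amounted to testing against today.year instead of each
        # post's own year (d.year), so the year condition never matched.
        if d.month == today.month and d.day == today.day and d.year < today.year
    ]

archive = [
    (date(2009, 1, 18), "On the Real Time Web"),
    (date(2007, 1, 18), "Jaiku As Story Teller (or: Jaiku Works)"),
    (date(2004, 1, 18), "Jay Allen's Spam Filter Added"),
    (date(2019, 1, 17), "Unrelated post"),
]
matches = posts_on_this_day(archive, today=date(2019, 1, 18))
```

Run with January 18, 2019 as “today,” this selects only the three archive entries from 2009, 2007 and 2004, which is the behaviour the ‘postedtoday’ shortcode relies on.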

In the right hand sidebar you now find a widget listing postings from earlier years, and you can see the same on the page ‘On This Blog Today In‘. I am probably my own most frequent reader of the archives, and having older postings presented to me like this adds some serendipity.

From today’s historic postings, the one about the real time web is still relevant to me in how I would like a social feed reader to function. And the one about a storm that kept me away from home, I still remember (ah, when Jaiku was still a thing!).

Adding these old postings is as simple as adding the shortcode ‘postedtoday’:

There are 3 posts found on this site published on January 18

  • January 18, 2009
    • On the Real Time Web This post is triggered by Friday’s posting on RWW by Bernard Lunn ‘Sorry Google, You Missed the Real-Time Web!’. In it Lunn rightly describes how keeping track of things that are happening right now is a new area of innovation, where big incumbents like Google don’t have much to bring to the table. (Consistent with […]
  • January 18, 2007
    • Jaiku As Story Teller (or: Jaiku Works) Mark Wubben posted this great and telling screenshot, which precisely shows what Jaiku does. (just as this one about what Plazes does) Today a big storm is crossing the Netherlands. On behalf of the police the national railsystem has been halted. As I was on the other side of the country today I am stuck […]
  • January 18, 2004
    • Jay Allen’s Spam Filter Added After having been hit with a considerable amount of commentspam starting on New Year’s Eve, I finally took the 15 minutes it costs to install Jay Allen’s comment and trackback spam filter. Tested it, and it works fine. A big thanks to Jay Allen for this effort! Latest developments in the fight against spam in […]
18 Jan 23:40

    Name That Skyline

    by Gordon Price

    Another example of a very familiar skyline that has changed so radically in the last few years, it seems to be a different city.


    Click to see its context.

    18 Jan 23:40

    The Shopping Channel discounts Bose QuietComfort 25 noise-cancelling headphones to $170

    by Andrew Mohan

    Bose’s QuietComfort 25 acoustic noise-cancelling headphones are on sale for CAD $169.99 as The Shopping Channel ‘Today’s Showstopper’.

    The original price of the headphones is $299.99.

    While they don’t have the Bluetooth or VPA capabilities of the QuietComfort 35, the headphones still sport an inline mic/remote for calls and controlling music playback.

    It’s important to note that the QuietComfort 25 is designed for use with the iPhone, iPad, and iPod. As such, the mic/remote will not work with Android devices.

    Included in the box are the headphones, the 1.42m inline mic/remote cable, airline adapter, a carrying case, and AAA batteries.

    Source: The Shopping Channel

    The post The Shopping Channel discounts Bose QuietComfort 25 noise-cancelling headphones to $170 appeared first on MobileSyrup.

    18 Jan 23:39

    Microsoft is ending support for Windows 10 Mobile in December 2019

    by Jonathan Lamont
    Microsoft logo

    Microsoft is officially ending support for Windows 10 Mobile devices in December 2019.

    Despite stopping development of new features and hardware in 2017, the company has continued pushing out security and software updates. As of December 10th, these updates will no longer come to users.

    Further, a Microsoft support page noted that Windows 10 Mobile version 1709, released in October 2017, was the last update.

    Additionally, Microsoft recommends Windows 10 Mobile users switch to iOS or Android devices.

    When the company ends support in December, some features like device backups for settings, along with some apps, will continue until March 10th, 2020. The company also says that some features like photo backup and restoring a device from an existing backup may continue to work for up to another 12 months after the end of support.

    Microsoft has focused on Android and iOS for some time and has released several apps for both platforms. Further, the company has turned to Android as the mobile version of Windows, releasing a launcher on the platform and developing PC-phone connectivity software as well.

    While Windows 10 Mobile users now face a choice between iOS and Android, for those who want to stay in Microsoft’s ecosystem, Android is the clear choice.

    Source: Microsoft Via: The Verge

    The post Microsoft is ending support for Windows 10 Mobile in December 2019 appeared first on MobileSyrup.

    18 Jan 23:39

    Linux apps to get GPU support in Chrome OS soon

    by Jonathan Lamont

    Following a report late last year that Linux apps on Chrome OS would get GPU support, a new commit — a change merged into the project’s source code — spied by About Chromebooks reveals that GPU support could arrive soon.

    The new feature should arrive shortly in Chrome OS Canary — an early, experimental build of Chrome OS used for testing. Further, GPU support will only come to select high-end Chromebooks.

    According to the commit posted on the Chromium Gerrit, an online collaboration tool for Chrome developers, the GPU support feature was merged for ‘Nami’ and ‘Eve’ boards.

    Eve represents the Pixelbook, which we already suspected would get the feature. However, several manufacturers use the Nami board in their Chromebooks, including:

    • Dell Inspiron 14
    • Lenovo Yoga Chromebook C630
    • Acer Chromebook 13
    • Acer Chromebook Spin 13
    • HP X360 Chromebook 14

    GPU support is a long-requested feature on Chromebooks that support Linux apps. As it currently stands, several Linux apps either don’t work or work poorly because they can’t access the GPU.

    However, with the introduction of this feature, that should be set to change.

    Source: Chromium Gerrit Via: About Chromebooks, 9to5 Google

    The post Linux apps to get GPU support in Chrome OS soon appeared first on MobileSyrup.

    18 Jan 23:38

    A circular firing squad of stupidity

    mkalus shared this story .

    Just when you think there is no further your expectations can fall, they find a way to surpass them. It's hard to imagine any two political figures behaving with less conviction or decency than Theresa May and Jeremy Corbyn this week. We must have truly done something terrible in a previous life to deserve this.

    At any other time, you'd presume that we'd reached some sort of nadir, a basement dungeon of political decency, beyond which it was impossible to fall. But this is Brexit, so they will invariably exceed themselves next week.

    May's attempt to 'reach out' in the wake of her historic defeat this week was as self-interested and false as it was possible to imagine. She refused to make any changes to her proposals at all. She was not willing to change her policy in the slightest.

    Instead, she appeared only interested in using the talks for short-term political advantage. A Downing Street statement was arranged for just hours after her invite, seemingly with the sole intention of pointing out to people that Corbyn had not attended yet. Those politicians who did attend talks - the Greens’ Caroline Lucas, the Lib Dems’ Vince Cable, the SNP’s Ian Blackford, and others - said there was nothing new on the table and that they wouldn’t bother attending again unless something changed.

    Corbyn saw how badly the prime minister was failing to live up to the moment and tried to match it. There is a kind of solidarity of irresponsibility between the two. Or perhaps it is a competition.
    The Labour leader demanded that the prime minister "rule out" no-deal as a precondition for talks.

    This is a standard tactic of parties who do not want to engage. It's equivalent to when governments demand a terrorist group renounces violence before entering into talks with it. It treats the ultimate aim as a precondition of holding negotiations to achieve it.

    These talks are a way to stop May from allowing no-deal. They are a chance for opposition parties to get her to pull on the brakes. But there is no point simply demanding that as a precondition of holding the talks at all. That is a party political game.

    Even if May said she ruled out no-deal, it would be a meaningless promise. No-deal happens automatically on March 29th as a result of the Article 50 notification. It is the default outcome. To stop it, we need a parliamentary majority for something. That could be a motion demanding the government petition for an extension to Article 50. Or it could be legislation to hold a People's Vote, with an attached extension. Or perhaps a new mandate for talks on customs union and single market membership, again with an attached extension provision.

    Even if we were simply to revoke Article 50, we would still need a parliamentary majority. The European Court of Justice ruling allowing the UK to unilaterally cancel Brexit stated that it had to do that in accordance with its constitutional arrangements. And the Gina Miller case, back in the halcyon days of 2017, showed that governments cannot use the royal prerogative to overrule a decision which has been taken by parliament. In this case, parliament voted to trigger Article 50. So either by a motion or a piece of legislation - there is a lively legal debate on exactly which - there needs to be a majority even to cancel Article 50.

    Of course, none of this really matters. Whether Corbyn holds talks with May or not is immaterial. She refused to give the talks meaning, because she ruled out making any changes to her policy on the basis of them. And he ruled out taking part in the talks by saying he would not participate unless she made the changes in advance. This has been an utter waste of our time.

    They are the most inadequate, self-interested, unimaginative, unprincipled, irresponsible party leaders in living history. There is no thesaurus in the world which could contain all the descriptions of their failures. In a moment which requires towering political figures, we're lumped with them: a prime minister with the intellectual status of a pebble and an opposition leader with the cerebral qualities of crumpled paper.

    It cannot possibly be any clearer that these two figures have failed the country and are incapable of living up to the historic moment they find themselves in.

    It is now up to MPs to take the lead. There are proposals floating around Westminster to give parliamentarians control over the Commons timetable. This could be attached as an amendment to May's motion on what she plans to do next. It is imperative that this now passes. There are no other routes of preventing disaster. Parliament must formally take control.

    This moment has shown us what terrible political leaders we have ended up with. When things are quieter, we can ask ourselves some searching questions about how that came to pass. For now, there is just one cheerful thought. Their inadequacy might force a constitutional change which gives parliament lasting powers over the executive. That would be the closest thing we ever find to a Brexit dividend.

    Ian Dunt is editor of and the author of Brexit: What The Hell Happens Now?


    18 Jan 23:38

    Inside the secret Fitzgerald probe the Navy doesn’t want you to read

    mkalus shared this story .

    A scathing internal Navy probe into the 2017 collision that drowned seven sailors on the guided-missile destroyer Fitzgerald details a far longer list of problems plaguing the vessel, its crew and superior commands than the service has publicly admitted.

    Obtained by Navy Times, the “dual-purpose investigation” was overseen by Rear Adm. Brian Fort and submitted 41 days after the June 17, 2017, tragedy.

    It was kept secret from the public in part because it was designed to prep the Navy for potential lawsuits in the aftermath of the accident.

    Unsparingly, Fort and his team of investigators outlined critical lapses by bridge watchstanders on the night of the collision with the Philippine-flagged container vessel ACX Crystal in a bustling maritime corridor off the coast of Japan.

    Their report documents the routine, almost casual, violations of standing orders on a Fitz bridge that often lacked skippers and executive officers, even during potentially dangerous voyages at night through busy waterways.

    The probe exposes how personal distrust led the officer of the deck, Lt. j.g. Sarah Coppock, to avoid communicating with the destroyer’s electronic nerve center — the combat information center, or CIC — while the Fitzgerald tried to cross a shipping superhighway.

    When Fort walked into the trash-strewn CIC in the wake of the disaster, he was hit with the acrid smell of urine. He saw kettlebells on the floor and bottles filled with pee. Some radar controls didn’t work and he soon discovered crew members who didn’t know how to use them anyway.

    Fort found a Voyage Management System that generated more “trouble calls” than any other key piece of electronic navigational equipment. Designed to help watchstanders navigate without paper charts, the VMS station in the skipper’s quarters was broken so sailors cannibalized it for parts to help keep the rickety system working.

    Since 2015, the Fitz had lacked a quartermaster chief petty officer, a crucial leader who helps safely navigate a warship and trains its sailors — a shortcoming known to both the destroyer’s squadron and Navy officials in the United States, Fort wrote.

    Fort determined that Fitz’s crew was plagued by low morale; overseen by a dysfunctional chiefs mess; and dogged by a bruising tempo of operations in the Japan-based 7th Fleet that left exhausted sailors with little time to train or complete critical certifications.

    To Fort, they also appeared to be led by officers who appeared indifferent to potentially life-saving lessons that should’ve been learned from other near-misses at sea, including a similar incident near Sasebo, Japan, that occurred only five weeks before the ACX Crystal collision, Fort wrote.

    [Image: The guided-missile destroyer Fitzgerald limps back to Japan following a collision with a merchant vessel on June 17, 2017. (Kazuhiro Nogi/AFP/Getty Images)]

    Fort’s work took on added urgency after another destroyer assigned to the 7th Fleet, the John S. McCain, collided with the Liberian-flagged tanker Alnic MC on Aug. 21, 2017, killing 10 more American sailors.

    But it remained an internal file never to be shared with the public.

    Pentagon officials declined to answer specific questions sent by Navy Times about the Fort report and instead defended the decision to keep the contents of the report hidden from public scrutiny.

    “The Navy determined to retain the legal privilege in order to protect the legal interests of the United States, but provided information regarding the causes and lessons learned to families of those sailors, the Congress and the American people, again to make every effort to ensure these types of tragedies to not happen again,” said Navy spokesman Capt. Gregory Hicks in a prepared written statement to Navy Times.

    In the 19 months since the fatal collision, the Navy’s Readiness Reform Oversight Council has made “significant progress” in implementing reforms called for in several top-level Navy reviews of the Fitzgerald and McCain collisions — nearly 75 percent of the 111 recommendations slated to be implemented by the end of 2018, Hicks added.

    Navy Times withheld publication of the Fort report’s details until Pentagon officials could brief the families of the dead Fitz sailors about the grim findings.

    Sailors Xavier Martin, Dakota Rigsby, Shingo Douglass, Tan Huynh, Noe Hernandez, Carlos Sibayan and Gary Rehm drowned in the disaster.

    Coppock pleaded guilty to a dereliction of duty charge at court-martial last year.

    The Fitz’s commanding officer, Cmdr. Bryce Benson, and Lt. Natalie Combs, who ran the CIC, are battling similar charges in court but contend unlawful command influence by senior leaders scuttled any chance for fair trials.

    When Fort arrived at her CIC desk, he found a stack of paperwork Combs abandoned: “She was most likely consumed and distracted by a review of Operations Department paperwork for the three and a half hours of her watch prior to the collision,” Fort wrote.

    [Image: Family, friends and shipmates attended a memorial ceremony at Fleet Activities Yokosuka on June 27, 2017, for the seven sailors killed when the guided-missile destroyer Fitzgerald collided with a merchant vessel off the coast of Japan 10 days earlier. (Navy)]

    Although Fort’s report drew parallels to a 2012 non-fatal accident involving the destroyer Porter and the supertanker M/V Otowasan in the Strait of Hormuz, his investigation focused on a near-miss by the Fitzgerald near Sasebo on May 10, 2017.

    During that incident, an unnamed junior officer “became confused by the surface contact picture” of vessels surrounding the destroyer and summoned the warship’s then-commanding officer, Cmdr. Robert Shu, to the bridge, according to Fort.

    Shu set the course to steer the Fitz behind the merchant vessel and then left the bridge.

    But once the officer in charge had cleared the other ship’s stern, he “became immediately aware that another vessel was on the opposite side” of the ship they had just dodged, Fort wrote.

    “(The officer) sounded five short blasts and ordered all back full emergency to avoid collision,” something Lt. j.g. Coppock failed to do weeks later when the ACX Crystal loomed out of the darkness, the report states.

    To Fort, the earlier incident should’ve been a wakeup call for both Shu and Cmdr. Benson, his executive officer who would soon “fleet up” to replace him as skipper, plus Benson’s future second-in-command, Cmdr. Sean Babbitt.

    “FTZ’s command leadership was unaware of just how far below standards their command had drifted,” wrote Fort, a surface warfare officer with more than a quarter-century of experience. “Had the (commanding officer) and (executive officer) critiqued the near-collision, they may have identified the root causes uncovered by this investigation.”

    When contacted by Navy Times, Shu recalled the incident that took place just east of the Tsushima Strait, “a normally busy and recognized waterway.”

    “As I was heading down the ladderwell to my cabin, I heard five short blasts and felt the ship back,” Shu said. “I ran back up to the bridge and there was another vessel behind the one we had just maneuvered for.”

    Although Shu couldn’t recall how close the two vessels got to each other, he insisted that the incident wasn’t a near-collision and that his bridge team “reacted appropriately” and later assured him that they had a good picture of the vessels around their destroyer.

    But Fort’s investigation pointed to a disturbing pattern of watchstanders failing to follow standing orders from a skipper and XO who often were inexplicably absent from the bridge, even when the warship was transiting potentially dangerous waters at night.

    One junior officer spoke of a similar near-collision during low visibility, when a watch team finishing their shift failed to identify a vessel that was closing on them and wasn’t being tracked, according to the report. The oncoming officer of the deck maneuvered out of the vessel’s way but never notified the commanding officer.

    Watchstanders admitted to knowing of other instances when ships got close enough to trigger a call to the CO, but they never made it, according to the report.

    “Procedural compliance by Bridge watchstanders is not the norm onboard FTZ, as evidenced by numerous, almost routine, violations of the CO’s standing orders,” not to mention radio transmissions laced with profanity and “unprofessional humor,” Fort found.

    [Image: The Arleigh Burke-class guided-missile destroyer Fitzgerald was heavily damaged during a June 17, 2017, collision with a merchant vessel.]

    Benson and predecessor Shu spent little time on the bridge during nighttime transits and Benson was asleep in his quarters on the fateful night the Fitzgerald collided with the ACX Crystal, Fort wrote.

    Some of Benson’s bridge team had never transited the busy waterway before, or had only done so during the day, and “his watchstanders were at least as fatigued as he was from a long day of operations without sufficient rest,” Fort found.

    It also was Benson’s first transit from Sagami Bay to the open sea as the warship’s skipper, a command he assumed just a few days after the near-collision off Sasebo.

    “It is inexplicable that neither Benson nor (executive officer Cmdr. Babbitt) were on the bridge for his first outbound Yokosuka transit as CO, at night, in close proximity to land, and expecting moderately dense fishing and merchant traffic,” Fort wrote.

    Ship travel is governed by the “rules of the road,” a set of guidelines regarding speed, lookouts and other best practices to avoid collisions, but Fort’s report casts doubt on whether watchstanders on board the Fitz and sister warships in the 7th Fleet had sufficient knowledge of them to safely navigate at sea.

    About three weeks after the ACX Crystal disaster, Fort’s investigators sprang a rules of the road pop quiz on Fitz’s officers.

    It didn’t go well. The 22 who took the test averaged a score of 59 percent, Fort wrote.

    “Only 3 of 22 Officers achieved a score over 80%,” he added, with seven officers scoring below 50 percent.

    The same exam was administered to the wardroom of another unnamed destroyer as a control group, and those officers scored similarly dismal marks.

    Babbitt, the executive officer, along with Coppock and two other officers, refused to take the test, according to the report.

    Reached by email, Babbitt told Navy Times that he declined because of the investigation and the fact that Fort had read him his rights.

    “The exam was also given weeks after the collision when the wardroom had not been concentrating on the rules of the road,” he said. “The crew had been pulled from event to event to include the memorial service and the dignified send off and the last thing anyone had been thinking about was how many lights a 50 meter towing vessel on inland waterways should have.”

    Speaking through his defense attorney, Benson declined to comment on the Fort report’s findings.

    In an email to Navy Times, Lt. Cmdr. Justin Henderson said Benson “has never declined or avoided the responsibility that is the burden of command at sea” and remains “accountable for the Fitzgerald and her crew, who remain at the forefront of his thoughts.”

    18 Jan 23:37

    Eat food and watch Netflix simultaneously with this fork and spoon-equipped smartphone case

    by Patrick O'Rourke

    Have you ever looked at your smartphone and thought, “man, I really wish my phone case had a removable fork and spoon built into it?”

    No? Me neither.


    As we’ve seen in the past with other wacky products that don’t exactly make sense, this hasn’t stopped someone from creating a Kickstarter focused on the case I described above, complete with an eye-roll-inducing name — the ‘Sphoon_phork.’ That said, the case’s Kickstarter page is still somewhat amusing, though it has only raised $122 CAD of its $14,686 goal three days into its campaign. The campaign is set to end on February 14th.

    “For over 12,000 years, humankind has strived to create the perfect civilization. We invented agriculture. We invented the wheel. We created a system of laws. We sent people to space. Now, a new chapter in our history has begun. Ladies and gentlemen, we bring you: the Sphoon_phork,” reads an excerpt from the case’s Kickstarter Page.

    The page then goes on to state that it took the team behind the Sphoon_phork roughly “a year to achieve the perfect design for a seamless experience of simultaneously eating while browsing.” It seems that the case’s ultimate goal is to allow you to continue to scroll through Instagram while still mowing down on food.

    The way the case actually works is relatively straightforward, as you may have already guessed. The spoon or fork attaches to the base of the smartphone case, letting you wield your chosen utensil — however awkwardly — while watching Netflix’s new Fyre Festival documentary. As far as I can tell, that really is all there is to it.

    The Kickstarter page states that the case will launch for the iPhone 7, iPhone 8, iPhone X and iPhone XS first, with plans to release additional versions in the future, including specific cases designed for the Samsung Galaxy S9, Galaxy S9+, Note 9, Google Pixel 3, Pixel 3 XL and a variety of other Android devices.


    It also looks like the Sphoon_phork will first release in white, though the ridiculous case’s Kickstarter page says that “more funky colours” are coming in the future.

    Kickstarter Early Bird pricing for the Sphoon_phork starts at $39 CAD, with a June 2019 shipping date.

    What a time to be alive.

    Image credit: Kickstarter 

    Source: Kickstarter

    The post Eat food and watch Netflix simultaneously with this fork and spoon-equipped smartphone case appeared first on MobileSyrup.

    18 Jan 23:37

    The 3D Printed Fountain Pen Works!

    by (Peter Rukavina)

    Earlier in the month I printed myself a fountain pen on my 3D printer. Without a nib assembly — the only part I couldn’t print myself — I wasn’t able to actually use the pen, however. This changed today when a Rosetta JoWo #6 Fountain Pen Nib, Ruthenium, Broad assembly arrived in the post. It screwed right into the barrel of the pen, I installed a J. Herbin larmes de cassis ink cartridge and, voilà, I had myself a new pen:

    Example of writing with the 3D printed fountain pen

    J. Herbin translates its larmes de cassis ink as “tears of blackcurrant,” which has got to be about the best name for an ink I can imagine; they continue: “this ink with a dark purple hue is the symbol of grievance and penitence.”

    I will use the pen accordingly.

    Regarding the Ruthenium-plated nib, my choice, it seems, was predestined; from Wikipedia:

    Ruthenium is a chemical element with symbol Ru and atomic number 44. It is a rare transition metal belonging to the platinum group of the periodic table. Like the other metals of the platinum group, ruthenium is inert to most other chemicals. Russian-born scientist of Baltic-German ancestry Karl Ernst Claus discovered the element in 1844 at Kazan State University and named it after the Latin name of his homeland, Ruthenia.

    My great-grandfather John’s record in the 1911 census shows Galicia as his “country of origin,” and Ruthenian as both his “racial or tribal origin” and “nationality.” He was born in Серафинці (Serafyntsi) in what is now western Ukraine, and arrived in Canada in 1908.

    Detail from 1911 census showing details about my grandfather