Étude #1

An attempt at an étude—short but dense writing—as introduced by Venkatesh Rao on ribbonfarm. It didn’t start off as one; it was supposed to be a short Facebook post, but after some typing, grew into this. (It ended up being posted on Facebook anyway.)


(305 words)

The “digital native” is a fallacy.

There are people who are curious about product features, people who are creative in the way they use digital tools, people who are clear about what they want and need a tool designed exactly for that, people who are insanely flexible in their workflows, people who are insanely rigid in their workflows, …

But there are no digital natives. No one speaks binary as a mother tongue. No one was born in the digital ether. No one is wired for the digital ether.

Our experience of digital information is mediated by architects, programmers, and designers, of information systems and interfaces. Even text-based command interfaces are a challenge in design: lexical and grammatical.

Our social interactions are also mediated by ritual interfaces. The greeting. The introduction. The handshake. The small-talk dance. The shared-with-public announcement. The friends-only personal celebration. Calling a friend’s name to ping their information stream. Social networks are not a digital thing, but we now have digital interfaces for them.

There are complex interfaces, and there are simple interfaces. There are fixed interfaces, and there are contextual interfaces. There are strict interfaces, and forgiving interfaces. Each unfamiliar interface has to be learnt anew, lest we commit ritual faux pas and are reprimanded to our inter-faces. The socially able are proficient at reading social interfaces. The digitally able are proficient at reading digital interfaces. There is nothing “native” about it; our grammatical skills are native, but the way we apply them to various interfaces is not.

So give the non-native a break. No one is supposed to be native at these things. It’s why we have “etiquette” courses. “Computer” courses are the same thing in different guise.

Be a gracious interface. Learn to make gracious interfaces, learn to teach gracious interfacing. Because none of us are natives at this.

Thresholds in computing: Part 10 – Beyond Thin-ITX

(Part 10 in a series of posts on small-form-factor computing)

In my previous posts on Thin-ITX, I made three complaints:

  1. There is still some cabling required in Thin-ITX, to connect the front panel ports (USB, audio) and power button to the motherboard.
  2. A Thin-ITX system can still be shrunk further, since nothing really limits it to a minimum size of 17×17cm.
  3. There are still more sources of heat than are really required for a Thin-ITX system.

The first two issues have been resolved by another (unofficial) form factor: Intel’s Next Unit of Computing (NUC). The NUCs are barebones small-form-factor systems—just add CPU, RAM, SSD, and water—with motherboards measuring 10×10cm. They carry the full range of connectivity (USB, audio, HDMI/DisplayPort) enjoyed by most PCs, although perhaps not the full complement of ports.


These things are deliciously small. In fact, at 12×11×3.5cm, they are barely larger than the motherboards they house. This is an approach to cabling elimination that we’ve seen before in ITX: if you can’t bring the board closer to the sides, bring the sides closer to the board. Continue reading Thresholds in computing: Part 10 – Beyond Thin-ITX

Thresholds in computing: Part 9 – heat dissipation and Thin-ITX

(Part 9 in a series of posts on small-form-factor computing)

I wish I had the equipment to make the kind of heatmaps that Puget Systems does in their comparison of horizontal vs vertical cooling. But I don’t, so we’ll just have to make do with overlays again.

Heat sources in a passively cooled system

Continue reading Thresholds in computing: Part 9 – heat dissipation and Thin-ITX

Thresholds in computing: Part 8 – Thin-ITX vs Mini-ITX

(Part 8 in a series of posts on small-form-factor computing)

When Intel released the first Thin-ITX motherboards at Computex in 2011, many folks were left scratching their heads: what use is Thin-ITX when we already have ITX? It quickly seemed that, aside from use in all-in-one (AIO) systems, Thin-ITX was a stillborn idea, consigned to a quiet fizzle-out once Intel had a more robust strategy figured out.

Thin-ITX parts

Today it is still hard to tell if Thin-ITX is going to really take off. But what I noticed, assembling my own Thin-ITX system, is that it’s not about the “thin” at all. Continue reading Thresholds in computing: Part 8 – Thin-ITX vs Mini-ITX

HDPlex H1.S review: Premium silence

One of the best ways to start any holiday is with the sight of this:

HDPlex H1.S, Asus Q87T, i5-4440S, Kingston SODIMMs

Prior to this, I was already on an ITX build, using the Realan E-Q6, which is actually a more compact case than the H1.S. But in many ways the E-Q6 is an inferior product: poor fit, inadequate tolerances, and a poorly located DC jack. I was also pushed along by a few other motivations: a desire to try a Thin-ITX build, and to get a new motherboard that properly supports 1440p (it seems that these days the only way to ensure this is to get a high-end motherboard with DisplayPort). But those are stories for another post. Continue reading HDPlex H1.S review: Premium silence

The Google Fortune Teller

The first Sherlock Holmes story I read (excluding the condensed for-kids versions) was The Hound of the Baskervilles. Revisiting my childhood reads, I was expecting to more closely examine the cause of my childhood awe and fascination with the detective mind, but was rather more surprised to find it missing.

Some pages later I (think I) located the cause. “We are coming now rather into the region of guesswork,” said Dr Mortimer. “Say, rather, into the region where we balance probabilities and choose the most likely. It is the scientific use of the imagination, but we have always some material basis on which to start our speculation. Now, you would call it a guess, no doubt, but I am almost certain that this address has been written in a hotel.”

(The classic) Sherlock is an algorithmic/logical mind, primed to generate hypotheses from correlated observations, and famously bored by a lack of interesting cases. Probabilistically he was of course right, but it was more shocking to me that he was never wrong. Therein lies the power of fiction and the privilege of the author: to pick one’s battles and their resulting outcomes. But I do not intend here to take Conan Doyle down a notch, nor to mar the famous detective’s record—there are plenty of other accounts to read for that. I bring this up because we have a real-world parallel, a chimeric Sherlock, who makes his own hypotheses, often invisibly. Continue reading The Google Fortune Teller

Thresholds in computing: Part 7 – ITX cooling

(Part 7 in a series of posts on small-form-factor computing)

Previously on Thresholds in computing, we crossed the 120W threshold and shrunk the PSU to picoPSU-size, leaving CPU cooling to be the main factor in determining case size.

More than a year ago, I bought a Realan E-Q6 ITX case, walking my talk and trying to see what it is like, shoehorning a high-end quad-core setup into such a tiny case. I’d tried this earlier, in a previous post on PSU efficiency, but I don’t have the power meter I used to make those measurements any more, and my setup has changed a bit since then. I didn’t actually use a picoPSU, but the 120W LR1005 PSU bundled with the case is about equivalent.

Another thing to note: I am not a hardware reviewer, and don’t make it a point to keep static setups that I can use to assess the performance of various coolers that I come across. All I’m using is lm_sensors reporting and a power meter acquired off eBay. Continue reading Thresholds in computing: Part 7 – ITX cooling
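For the curious, the kind of instrumentation described above can be sketched in a few lines of Python. This is my own minimal sketch, not the script actually used in the post: it assumes lm_sensors is installed (so the `sensors` command exists), and the function names are hypothetical.

```python
import re
import subprocess

def parse_temps(sensors_output: str) -> dict:
    """Extract temperature readings from `sensors -u` raw output.

    Maps each raw label (e.g. 'temp1_input') to degrees Celsius.
    Note: labels can repeat across chips, so later chips overwrite
    earlier ones in this simplified sketch.
    """
    temps = {}
    for line in sensors_output.splitlines():
        m = re.match(r"\s*(temp\d+_input):\s*([\d.]+)", line)
        if m:
            temps[m.group(1)] = float(m.group(2))
    return temps

def read_temps() -> dict:
    """Run `sensors -u` (from lm_sensors) and parse its output."""
    out = subprocess.run(["sensors", "-u"], capture_output=True, text=True)
    return parse_temps(out.stdout)
```

Polling `read_temps()` in a loop while noting wall-socket draw from the power meter is roughly all the measurement rigour on offer here.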