Sarah Crane, director of the Federal Citizen Information Center, GSA, is going to talk about USA.gov. “In a world where everyone can search and has apps, is a web portal relevant?” she asks.
NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.
When the US web portal (first.gov [archival copy]) was launched in 2000, it had an important role in aggregating and centralizing content. Now people arrive through search.
USA.gov is a platform that offers a full suite of bilingual products (information, contacts, social media, etc.), all built around a single structured API. The presentation layer is independent of the content, so thanks to the API, all the different outputs use the same consistent content.
It’s designed to integrate with other agency content. In fact, they don’t want to be writing any of the content; it should come from other agencies. It’s also built so its modules and content can be reused. And it’s built to scale: it can support expansion or consolidation. E.g., if an initiative loses steam, its content can be pulled in and kept available.
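The single-API idea above can be sketched in a few lines: one structured content record, several independent presentation layers drawing on it. Everything here (field names, the record itself) is invented for illustration; it is not USA.gov’s actual schema.

```python
# One content record, as a hypothetical content API might return it.
# All field names and values are made up for this sketch.
CONTENT_API_RECORD = {
    "id": "passport-renewal",
    "title": "Renew a Passport",
    "title_es": "Renovar un pasaporte",
    "summary": "How to renew a U.S. passport by mail or in person.",
    "contacts": [{"agency": "Department of State", "phone": "1-877-487-2778"}],
}

def render_web(record, lang="en"):
    """One presentation layer: an HTML snippet, bilingual."""
    title = record["title_es"] if lang == "es" else record["title"]
    return f"<article><h1>{title}</h1><p>{record['summary']}</p></article>"

def render_plaintext(record):
    """Another layer, e.g. for a contact-center screen or SMS."""
    contact = record["contacts"][0]
    return f"{record['title']}: {record['summary']} Call {contact['phone']}."

# Both outputs stay consistent because they draw on the same record,
# and either front end can change without touching the content.
print(render_web(CONTENT_API_RECORD, lang="es"))
print(render_plaintext(CONTENT_API_RECORD))
```

The point of the sketch: swap the renderer and the content doesn’t move; swap the content store behind the record and the renderers don’t notice.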
How people use govt services: They look online, they ask their friends, and they expect it to be easy. People are surprised when it’s complex. Some people prefer in-person help.
So, how does the portal remain relevant?
Customer experience is a core tenet. They recently launched a Customer Experience division: constant measurement of performance, fixing what doesn’t work, and clear lines of reporting that reach up to senior management and all the way down to the devs.
Last year they re-did their personas, based on four different behaviors: 1. Someone who knows exactly what s/he’s looking for. 2. Someone who has a general idea, but isn’t informed enough to search. 3. Someone who wants to complete a transaction. 4. Someone who wants to contact an elected official. They analyzed the experiences and did “journey maps”: how someone gets to what she wants. These journeys often include travels into other agencies, which they also mapped.
What’s next for them now that info is cheap and easy to find? Sarah likes Mint.com’s model:
- Aggregated, personalized content collected from multiple agencies.
- Pre-emptive service – alerts, etc.
- Relevant updates as you are in the task.
For further info, see Blog.USA.gov and USA.gov/Explore.
Q: [me] Are people building on top of your API?
A: Some aspects, yes. Heavily used: the A-Z agency index – the only complete listing of every agency and their contact info. There’s a submission to build a machine-readable org chart of the govt that will build on top of our platform. [OMG! That would be incredible! And what is happening to me that I’m excited about a machine-readable org chart?]
Also, if you use bit.ly to shorten a gov’t URL, it creates a one.usa.gov link, which you can use to track Twitter activity, etc.
Certain aspects of the API are being used heavily, primarily the ones that show a larger perspective.
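A machine-readable org chart of the government, built on an agency-index API, might look something like this. The agency records and the parent/child fields are invented for illustration; they are not the actual A-Z index schema.

```python
# Hypothetical flat agency records, each pointing at its parent.
agencies = [
    {"name": "Executive Branch", "parent": None},
    {"name": "General Services Administration", "parent": "Executive Branch"},
    {"name": "Federal Citizen Information Center",
     "parent": "General Services Administration"},
]

def build_org_chart(records):
    """Turn flat parent references into a nested dict keyed by agency name."""
    children = {}
    for r in records:
        children.setdefault(r["parent"], []).append(r["name"])

    def subtree(name):
        return {child: subtree(child) for child in children.get(name, [])}

    # Roots are the records whose parent is None.
    return {root: subtree(root) for root in children.get(None, [])}

chart = build_org_chart(agencies)
```

Once the index is machine-readable, anyone can walk the tree, diff it over time, or hang other open data off its nodes; that’s what makes it exciting.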
Q: Won’t people find personal notifications from the govt creepy, even though they like it when it’s Mint or Amazon?
A: The band-aid solution is to make it opt-in. Also being transparent about the data, where it’s stored, etc. This can never be mandatory. The UK’s e-verify effort aims at making the top 20 services digital through a single ID. We’d have to study that carefully. We’d have to engage with the privacy groups (e.g., EPIC) early on.
Q: Suppose it was a hybrid of automated and manual? E.g., I tell the site I’m turning 62 and then it gives me the relevant info, as opposed to it noting from its data that I’m turning 62.
Q: We’re losing some of the personal contact. And who are you leaving behind?
A: Yes, some people want to talk in person. Our agency actually started in 1972 supplying human-staffed kiosks where people could ask questions. Zappos is a model: You can shop fully online, but people call their customer service because it’s so much fun. We’re thinking about prompting people if they want to chat with a live person.
The earliest adopters are likely to be the millennials, and they’re not the ones who need the services generally. But they talk with their parents.
I briefly interviewed Sarah afterwards. Among other things, I learned:
The platform was launched in July.
They are finding awesome benefits to the API approach as an internal architecture: consistent and efficiently-created content deployed across multiple sites and devices; freedom to innovate at both the front and back end; a far more resilient system that will allow them to swap in a new CMS with barely a hiccup.
I mentioned NPR’s experience with moving to an API architecture, and she jumped in with COPE (create once, publish everywhere) and has been talking with Dan Jacobson, among others. (I wrote about that here.)
She’s certainly aware of the “government as platform” approach, but says that that phrase and model are more directly influential over at 18F.
Sarah is awesome.
Tagged with: api
Date: October 26th, 2015 dw
“If all machines were to be annihilated at one moment, so that not a knife nor lever nor rag of clothing nor anything whatsoever were left to man but his bare body alone that he was born with, and if all knowledge of mechanical laws were taken from him so that he could make no more machines, and all machine-made food destroyed so that the race of man should be left as it were naked upon a desert island, we should become extinct in six weeks. A few miserable individuals might linger, but even these in a year or two would become worse than monkeys. Man’s very soul is due to the machines; it is a machine-made thing: he thinks as he thinks, and feels as he feels, through the work that machines have wrought upon him, and their existence is quite as much a sine quâ non for his, as his for theirs.”
Samuel Butler, Erewhon, Chapter XXIV, 1872.
This is less rhapsodic than it may seem, for it continues:
“This fact precludes us from proposing the complete annihilation of machinery, but surely it indicates that we should destroy as many of them as we can possibly dispense with, lest they should tyrannise over us even more completely.”
Tagged with: andy clark
Date: October 15th, 2015 dw
The New Atlantis has just published five essays exploring “The Unknown Newton”. It is — bless its heart! — open access. Here’s the table of contents:
Rob Iliffe provides an overview of Newton’s religious thought, including his radically unorthodox theology.
William R. Newman examines the scientific ambitions in Newton’s alchemical labors, which are often written off as deviations from science.
Stephen D. Snobelen — who in the course of writing his essay discovered Newton’s personal, dog-eared copy of a book that had been lost — provides an in-depth look at the connection between Newton’s interpretation of biblical prophecy and his cosmological views.
Andrew Janiak explains how Newton reconciled the apparent tensions between the Bible and the new view of the world described by physics.
Finally, Sarah Dry describes the curious fate of Newton’s unpublished papers, showing what they mean for our understanding of the man and why they remained hidden for so long.
Stephen Snobelen’s article, “Cosmos and Apocalypse,” begins with a paper in the John Locke collection at the Bodleian: Newton’s hand-drawn timeline of the events in Revelation. Snobelen argues that we’ve read too much of the Enlightenment back into Newton.
In particular, the concept of the universe as a pure clockwork that forever operates according to mechanical laws comes from Laplace, not Newton, says Snobelen. He refers to David Kubrin’s 1967 paper “Newton and the Cyclical Cosmos”; it is not open access. (Sign up for free with Jstor and you get constrained access to its many riches.) Kubrin’s paper is a great piece of work. He makes the case — convincingly to an amateur like me — that Newton and many of his cohorts feared that a perfectly clockwork universe that did not need Divine intervention to operate would be seen as also not needing God to start up. Newton instead thought that without God’s intervention, the universe would wind down. He hypothesized that comets — newly discovered — were God’s way of refreshing the Universe.
The second half of the Kubrin article is about the extent to which Newton’s late cosmogony was shaped by his Biblical commitments. Most of Snobelen’s article is about a discovery in 2004 of a new document that confirms this, and adds to it that God’s intervention heads the universe in a particular direction:
In sum, Newton’s universe winds down, but God also renews it and ensures that it is going somewhere. The analogy of the clockwork universe so often applied to Newton in popular science publications, some of them even written by scientists and scholars, turns out to be wholly unfitting for his biblically informed cosmology.
Snobelen attributes this to Newton’s recognition that the universe consists of forces all acting on one another at the same time:
Newton realized that universal gravity signaled the end of Kepler’s stable orbits along perfect ellipses. These regular geometric forms might work in theory and in a two-body system, but not in the real cosmos where many more bodies are involved.
To maintain the order represented by perfect ellipses required nudges and corrections that only a Deity could accomplish.
Snobelen points out that the idea of the universe as a clockwork was more Leibniz’s idea than Newton’s. Newton rejected it. Leibniz got God into the universe through a far odder idea than as the Pitcher of Comets: souls (“monads”) experience inhabiting a shared space in which causality obtains only because God coordinates a string of experiences in perfect sync across all the monads.
“Newton’s so-called clockwork universe is hardly timeless, regular, and machine-like,” writes Snobelen. “[I]nstead, it acts more like an organism that is subject to ongoing growth, decay, and renewal.” I’m not sold on the “organism” metaphor based on Snobelen’s evidence, but that tiny point aside, this is a fascinating article.
Tagged with: future
Date: August 18th, 2015 dw
I got a little interested in the question of Isaac Newton’s connection to astrology because of something I’ve been working on about causality. After all, Newton pursued alchemical studies with great seriousness. And he gave us a theory of action at a distance that I thought might be taken as providing a rationale for astrological effects.
But, no. According to a post by Graham Bates:
In a library of 1,763 books (1,752 different titles excluding duplicates), he had 369 books on what we would call scientific subjects, plus 169 on Alchemy (including many of the important texts on the subject copied in his own hand); there were also 477 books on Theology. He possessed only four books on astrology; two of these were treatises on astrology, one was an almanac, and one was a refutation of astrology.
Bates says that a book on astrology that Newton purchased as a boy led him to learn about Euclid’s theorems so he could construct an astrological chart, but that is the extent of his known interest.
Bates also does a good job tracking down a spurious quote:
There is a story, much quoted in astrological articles and books, about a dispute between Newton and Halley (of the comet fame), supposedly about astrology, in which Newton replies to a remark by Halley “I have studied these things, you have not”.
The actual quote refers to theology, not astrology. So, no, Newton was not a practitioner of astrology, and there’s no reason to think that he gave it any credence. (Me neither, by the way.)
The post is on the Urania Trust site, which I had not heard of before. The group was founded in 1970 “to further the advancement of education by the teaching of the relationship between main’s [sic] knowledge of, beliefs about, the heavens and every aspect of his art science philosophy and religion.” Given its commitment to taking astrology seriously, the fairness of its post about Newton is admirable.
(Now if I could only find out if Newton played billiards.)
Tagged with: future
Date: August 17th, 2015 dw
In 2008-9, NPR, the NY Times, and The Guardian opened up public APIs, hoping that it would spur developers around the world to create wonderful and weird apps that would make use of their metadata and spread the availability of news.
Very little happened. By any normal measure, the experiment would have to be deemed a failure.
These three news organizations are nevertheless fervid evangelists for the same APIs—for internal use. They provide an abstraction layer that makes the news media’s back ends far easier to maintain without disrupting their availability to users, they enable these organizations to adapt to new devices and workflows insanely quickly, they facilitate strategic partnerships, they lower the risk of experimentation, and more.
This was the topic of the paper I wrote during my fellowship at The Shorenstein Center. The paper then looks at ways we might still get to the open ecosystem for news that was first envisioned.
The full paper is available freely at the Shorenstein site.
There’s an op-ed length version at Nieman Reports.
Tagged with: future
Date: July 13th, 2015 dw
I wanted to play Tim Berners-Lee’s 1999 interview with Terry Gross on WHYY’s Fresh Air. Here’s how that experience went:
I find a link to it on a SlashDot discussion page.
The link goes to a text page that has links to Real Audio files encoded either for 28.8 or ISDN.
I download the ISDN version.
It’s a RAM (Real Audio) file that my Mac (Yosemite) cannot play.
I look for an updated version on the Fresh Air site. It has no way of searching, so I click through the archives to get to the Sept. 16, 1999 page.
It’s a 404 page-not-found page.
I search for a way to play an old RAM file.
The top hit takes me to Real Audio’s cloud service, which offers me 2 gigabytes of free storage. I decline.
I pause for ten silent seconds in amazement that the Real Audio company still exists. Plus it owns the domain “real.com.”
I download a copy of RealPlayerSP from CNET, thus probably also downloading a copy of MacKeeper. Thanks, CNET!
I open the Real Player converter and Apple tells me I don’t have permission because I didn’t buy it through Apple’s TSA clearance center. Thanks, Apple!
I do the control-click thang to open it anyway. It gives me a warning about unsupported file formats that I don’t understand.
I set System Preferences > Security so that I am allowed to open any software I want. Apple tells me I am degrading the security of my system by not giving Apple a cut of every software purchase. Thanks, Apple!
I drag in the RAM file. It has no visible effect.
I use the converter’s upload menu, but this converter produced by Real doesn’t recognize Real Audio files. Thanks, Real Audio!
I download and install the Real Audio Cloud app. When I open it, it immediately scours my disk looking for video files. I didn’t ask it to do that and I don’t know what it’s doing with that info. A quick check shows that it too can’t play a RAM file. I uninstall it as quickly as I can.
I download VLC, my favorite audio player. (It’s a new Mac and I’m still loading it with my preferred software.)
Apple lets me open it, but only after warning me that I shouldn’t trust it because it comes from [dum dum dum] The Internet. The scary scary Internet. Come to the warm, white plastic bosom of the App Store, it murmurs.
I drag the file into VLC. It fails, but it does me the favor of telling me why: It’s unable to connect to WHYY’s Real Audio server. Yup, this isn’t a media file, but a tiny file that sets up a connection between my computer and a server WHYY abandoned years ago. I should have remembered that that’s how Real worked. Actually, no, I shouldn’t have had to remember that. I’m just embarrassed that I did not. Also, I should have checked the size of the original Fresh Air file that I downloaded.
A search for “Tim Berners-Lee Fresh Air 1999” immediately turns up an NPR page that says the audio is no longer available.
It’s no longer available because in 1999 Real Audio solved a problem for media companies: install an RA server and it’ll handle the messy details of sending audio to RA players across the Net. It seemed like a reasonable approach. But it was proprietary and so it failed, taking Fresh Air’s archives with it. Could and should Fresh Air have converted its files before it pulled the plug on the Real Audio server? Yeah, probably, but who knows what the contractual and technical situation was.
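The pointer-file nature of a .ram file is easy to see: it’s just a few lines of text holding the URL of a streaming server. Here’s a minimal sketch of reading one; the WHYY server address below is invented for illustration.

```python
# A .ram file isn't audio; it's a tiny text metafile listing the URL(s)
# of a streaming server. If that server is gone, the audio is gone.
# This server path is made up for the sketch.
ram_contents = "pnm://audio.example-whyy.org/freshair/fa091699.rm\n"

def parse_ram(text):
    """Return the streaming URLs listed in a RAM pointer file."""
    urls = []
    for line in text.splitlines():
        line = line.strip()
        # RealAudio metafiles typically listed pnm:// or rtsp:// links.
        if line.startswith(("pnm://", "rtsp://", "http://")):
            urls.append(line)
    return urls

urls = parse_ram(ram_contents)
```

A few dozen bytes of text, and the actual media lived (and died) somewhere else: that’s the whole story of the dead link above.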
By not following the example set by Tim Berners-Lee — open protocols, open standards, open hearts — this bit of history has been lost. In this case, it was an interview about TBL’s invention, thus confirming that irony remains the strongest force in the universe.
I finally got to see the Chattanooga Library. It was even better than I’d expected. In fact, you can see the future of libraries emerging there.
That’s not to say that you can simply list what it’s doing and do the same things and declare yourself the Library of the Future. Rather, Chattanooga Library has turned itself into a platform. That’s where the future is, not in the particular programs and practices that happen to emerge from that platform.
I got to visit, albeit all too briefly, because my friend Nate Hill, assistant director of the Library, invited me to speak at the kickoff of Chattanooga Startup Week. Nate runs the fourth floor space. It had been the Library’s attic, but now has been turned into an open space lab that works in both software and hardware. The place is a pleasing shambles (still neater than my office), open to the public every afternoon. It is the sort of place that invites you to try something out — a laser cutter, the inevitable 3D printer, an arduino board … or to talk with one of the people at work there creating apps or liberating data.
The Library has a remarkable open data platform, but that’s not what makes this Library itself into a platform. It goes deeper than that.
Go down to the second floor and you’ll see the youth area under the direction/inspiration of Justin Hoenke. It’s got lots of things that kids like to do, including reading books, of course. But also playing video games, building things with Legos, trying out some cool homebrew tech (e.g., this augmented reality sandbox by 17-year-old Library innovator, Jake Brown (github)), and soon recording in audio studios. But what makes this space a platform is its visible openness to new ideas that invites the community to participate in the perpetual construction of the Library’s future.
This is physically manifested in the presence of unfinished structures, including some built by a team of high school students. What will they be used for? No one is sure yet. The presence of lumber assembled by users for purposes to be devised by users and librarians together makes clear that this is a library that one way or another is always under construction, and that that construction is a collaborative, inventive, and playful process put in place by the Library, but not entirely owned by the Library.
As conversations with the Library Director, Corinne Hill (LibraryJournal’s Librarian of the Year, 2014), and Mike Bradshaw of Colab — sort of a Chattanooga entrepreneurial ecosystem incubator — made clear, this is all about culture, not tech. Open space without a culture of innovation and collaboration is just an attic. Chattanooga has a strong community dedicated to establishing this culture. It is further along than most cities. But it’s lots of work: lots of networking, lots of patient explanations, and lots and lots of walking the walk.
The Library itself is one outstanding example. It is serving its community’s needs in part by anticipating those needs (of course), but also by letting the community discover and develop its own interests. That’s what a platform is about.
It’s also what the future is about.
Here are two relevant things I’ve written about this topic: Libraries as Platforms and Libraries won’t create their own futures.
Tagged with: future
Date: October 7th, 2014 dw
Library Journal has posted an op-ed of mine that begins:
The future of libraries won’t be created by libraries. That’s a good thing. That future is too big and too integral to the infrastructure of knowledge for any one group to invent it. Still, that doesn’t mean that libraries can wait passively for this new future. Rather, we must create the conditions by which libraries will be pulled out of themselves and into everything else.
Tagged with: future
Date: September 22nd, 2014 dw
At Medium.com I have a short piece on what progress looks like on the Internet, which is not what progress used to look like. I think.
I wrote this for the Next Web conference blog. (I’m keynoting their Dec. conference in NYC.)
Tagged with: future
Date: September 8th, 2014 dw
Here’s the video of my talk at The Next Web in Amsterdam on Friday. I haven’t watched it because I don’t like watching me and neither should you. But I would be interested in your comments about what I’m feeling my way toward in this talk.
It’s about what I think is a change in how we think about the future.
Tagged with: future
Date: April 27th, 2014 dw