I often find my eyes glazing over at conceptual interfaces for computing, because I usually just want to use the thing and see how it feels, but the nice demos in this one stopped me.
The promise of digital systems, for me, has always had something to do with ‘surfacing the right amount of meaningful things at the right time’. I have approximated this in my apps by requiring explicit actions to surface things, because it’s beyond my capacity to imagine how to do this more automatically, and because I generally distrust machines to automate it well. So how nice it is to see a vision for creating structures and associations with little friction, more or less by directing your attention. Computers should be good at this, while still allowing us to tweak things so we don’t rely completely on a black box:
The system can handle most of the heavy lifting by simply paying attention to how we move through our items within different contexts, but we can further manage the associations manually as we like.
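That division of labor can be made concrete. Here is a minimal sketch, in Python, of what it might look like: items touched close together within the same context accumulate association weight passively, while explicit pin/unlink calls let the user override the machine. Every name here (`AssociationTracker`, `touch`, `pin`, `unlink`, `related`) is hypothetical, invented for illustration, not drawn from the system being discussed.

```python
from collections import defaultdict

class AssociationTracker:
    """Passive association-building: items viewed near each other in the
    same context accumulate weight; manual pins/unlinks take precedence."""

    def __init__(self, window=3):
        self.window = window               # how many recent items count as "together"
        self.recent = defaultdict(list)    # context -> items touched, in order
        self.weights = defaultdict(float)  # (item_a, item_b) -> learned weight
        self.pinned = set()                # pairs the user confirmed by hand
        self.unlinked = set()              # pairs the user severed by hand

    def _pair(self, a, b):
        return tuple(sorted((a, b)))

    def touch(self, context, item):
        """Record that the user brought `item` into view within `context`."""
        for other in self.recent[context][-self.window:]:
            if other != item:
                self.weights[self._pair(item, other)] += 1.0
        self.recent[context].append(item)

    def pin(self, a, b):
        """Manually confirm an association the system may have missed."""
        self.pinned.add(self._pair(a, b))
        self.unlinked.discard(self._pair(a, b))

    def unlink(self, a, b):
        """Manually sever an association the system learned wrongly."""
        self.unlinked.add(self._pair(a, b))
        self.pinned.discard(self._pair(a, b))

    def related(self, item, top=5):
        """Items associated with `item`: pinned pairs first, then by weight."""
        scores = {}
        for (a, b), w in self.weights.items():
            if item in (a, b):
                other = b if a == item else a
                pair = self._pair(item, other)
                if pair in self.unlinked:
                    continue
                scores[other] = w + (1e9 if pair in self.pinned else 0.0)
        for a, b in self.pinned:
            if item in (a, b):
                other = b if a == item else a
                scores.setdefault(other, 1e9)
        return [x for x, _ in sorted(scores.items(), key=lambda kv: -kv[1])][:top]
```

The point of the sketch is the shape, not the math: the heavy lifting (`touch`) happens as a side effect of ordinary attention, and the manual controls are small, optional corrections on top.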
Bringing things into view in the way presented here is so much more compelling than clicking around through filesystems or apps. The closest thing I’ve seen and used is Quicksilver’s way of ‘knowing’ via key combinations and their frequency of use, but that requires explicit association. Capturing intent passively rather than explicitly means you don’t have to be a programmer to benefit.
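One common way systems rank candidates by past use without explicit training is a ‘frecency’ score, where each prior access contributes weight that decays with age, so frequency and recency both raise the score. This is a generic sketch of that idea, not a description of Quicksilver’s actual internals; the function name and `half_life` parameter are my own assumptions.

```python
import time

def frecency(access_times, now=None, half_life=7 * 24 * 3600):
    """Score an item by its access history: each past access (a Unix
    timestamp) contributes 0.5 ** (age / half_life), so recent and
    frequent use both push the item up the ranking."""
    now = time.time() if now is None else now
    return sum(0.5 ** ((now - t) / half_life) for t in access_times)
```

Ranking a list of items is then just `sorted(items, key=lambda i: frecency(history[i]), reverse=True)`: no one ever declares an association, yet the things you actually reach for float to the top.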
It’s important to have higher-level primitives baked into lower levels, rather than reconstructed in each app. This can mean schemas, file formats, or an operating system itself. Your trail or history is valuable and shouldn’t be siloed in certain apps or rebuilt bespoke for each of them. How can this be constructed without a universal app for all the things? (Or is that just another operating system?) How can it be done in a way where the data is not siloed within this system, even though it seems to afford great flexibility across app boundaries?