- I used to be convinced that "files" or similar units of storage would remain relevant forever. Today, some people have never in their lives encountered the concept of files, or basically anything that persists when a computer system is turned off and on again. Almost all aspects of computer usage have become transactional, and people prefer to orient themselves along the dimension of time more than anything (although that is crumbling, too).
@despens @entreprecariat do people prefer it or has the capacity for it been systemically stripped away by the mobile OS duopoly? Google minimizes the file system on Android devices because they want everyone to search for everything (which generates intent data they can extract profit from). Apple obfuscates the fs because they don't even really want users to think of their devices as computers (because consumption appliances are their big moneymakers now)
Many users struggled with the burden of organizing their files before these designs were introduced. Apple and Google exploited this to their gain, much as Apple, IBM, Microsoft, Sun, and many others once exploited users' struggles with file management on the command line by giving them a desktop. There was plenty of criticism at the time: users would become dependent on GUIs and never understand how the file system actually works, etc.
But indeed, the mobile design patterns have a different ring to them, esp. when it comes to behavioral surveillance. Yet, Apple and Google produced a set of idioms that users prefer over something like Windows CE or Windows Mobile. These designs are based on user research and A/B testing and do represent something that users actually prefer. (Of course this goes hand in hand with the choices presented to them.)
I witnessed this for more than a decade teaching programming to designers and artists, and there's lots of literature about it.
@despens @entreprecariat Agreed, yeah. A key difference between the GUI revolution and the rise of mobile is that GUIs were seen as massive accessibility wins that simply made computers more appealing to people. Whereas with mobile, the two companies were very much aware they were creating new markets (platforms, in the modern sense) whose design had to enact their overall business strategy, a constraint that I don't think was present in the GUIs of the early 80s.
@despens @entreprecariat I agree that there have been some real accessibility benefits from app- and task-based (rather than document- or file-system-based) workflows, i.e. what the mobile OSes are built entirely around. But these designs exist only secondarily as affordances and primarily as concretizations of the business model; anything that serves the former but not the latter doesn't stick around.
@jplebreton @entreprecariat Windows 3 was already designed to be task-centric, in an effort to establish activities as products. Since the introduction of the Dock in Mac OS X 10.0, I'd say Apple's desktop is also largely task-centric. The first iPhone had no third-party apps and instead promoted rich web applications. Lots of these developments happen iteratively. And of course, task-based design matches well with the idea of software subscriptions.
It could have gone another way, too: for instance, a subscription-based market could have been created for viewers and editors of different data formats that integrate with a file-based main interface (kind of like how KDE was originally designed as a more thorough implementation of Windows' OLE architecture). It would still have been possible to track users and to present ads in a file manager (as Windows 7 did for some time).
So if there's a need for another arbitrary change of interface idioms to disrupt an existing software market, there are lots of ideas available to reuse, or new ones to invent. The business model of continuously having to pay (in money or data) to be able to use your device can be mapped to all types of user interfaces, I believe.
@despens @jplebreton @entreprecariat the personal computer was in some ways a response to the time-share model of computers, where you literally did pay for access to data/computation. It was basically as you describe, a subscription based viewer/editor that accessed resources as you paid for them :)
I am wondering how this development has changed the concept of the «user»?
I mean, with time-sharing we had users who were actually programmers, and sometime later the concept of the end-user came up, meaning something like a software consumer. Now, with data-driven business models, how is this developing further? Any thoughts about this?
@shusha @emenel @despens @entreprecariat I would definitely check out "Lurking: How a Person Became a User" by @jomc for unique insights on this.
It seems like the biggest shift has been how "user" as defined in a consumer products sense kind of gobbled up every other definition during the 90s, and how that in turn fed the shift to users as resources for (ad) data centric businesses.
@shusha @emenel @despens @jplebreton @entreprecariat a while ago I wrote a bit about this transformation but mostly from the perspective of UNIX filesystem and home dirs, user and process sandboxing, and app culture, check the first half of this https://bleu255.com/~aymeric/dump/mansoux-chroot-rorw-2013.pdf
@jplebreton @entreprecariat I honestly believe that this was the case as well for Microsoft Windows, the Macintosh, Sun, Acorn, etc. Going GUI was a market shakeup that allowed the OS makers to replace existing established software from competitors with their own, tie developers to themselves, and expand their market. Microsoft's OLE design was, on the one hand, a realization of Kay's Dynabook ideas, but it also made it very hard to port a Windows application to multiple platforms. Sun tied their OS to their own processor architecture. Apple created a parallel file system universe and sold diskettes with an Apple logo on them for 5x the price of regular ones.