Touchscreens were for me a lesson in humility: when they were first introduced, I was utterly convinced that they would never take off.


Inspired by the responses to the previous toot, I'll open the question: what was your most blatantly wrong technological prophecy?

Mine:
- I was sure touchscreens would never take off.
- I believed that the idea of registering digital art on the blockchain was ridiculous


another device I thought would be a huge flop: the iPad Pro

@entreprecariat 'I believed that the idea of registering digital art on the blockchain was ridiculous'

What changed your mind?

@rra let's put it like this: I still find it ridiculous, but I can't ignore the economy that NFTs and the like are "generating".

@rra from wp: "The NFT market value tripled in 2020, reaching more than $250 million. During the first quarter of 2021, NFT sales exceeded $2 billion."

@entreprecariat @rra but this is very unevenly distributed. I would guess these numbers are split between very few big whales.

@entreprecariat @rra there was a good article a few months back with charts of how uneven it is but I can't find it now

@entreprecariat @rra the size of the economy doesn't make it a good idea. NFTs are just very environmentally unfriendly and basically unenforceable derivatives.

The "unenforceable" part is what makes them a borderline scam.

@entreprecariat @rra Ponzi schemes and multi-level marketing schemes were/are also generating huge economies, after all.

@rysiek @rra i agree completely that they're a bad idea, but reality is full of bad ideas, and most of them can't be ignored

@entreprecariat @rysiek @rra if your initial conclusion was simply that it was ridiculous, you were absolutely not wrong. after the hype explosion earlier this year the space is now mostly a cesspool of crypto pump-and-dumpers (NFTs help create a smokescreen of legitimacy around crypto in general) with a few diehard artists still begging for attention at the fringes.

@entreprecariat @rra
I think the Economics Explained video covers this pretty well. NFT art DOESN'T make sense, but neither does the actual physical art market. At the end of the day, people like to signal their wealth and signal the art that they care about. When every copy is identical, the certificate of authenticity becomes very important, and using a digital certificate of authenticity isn't a huge stretch.
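A minimal sketch of that "digital certificate of authenticity" idea, assuming a toy in-memory registry rather than any real blockchain or the ERC-721 standard: the token is just a record binding a content fingerprint to whoever currently holds the certificate, while the artwork itself stays freely copyable.

# Toy, in-memory "certificate of authenticity" registry (hypothetical example,
# not any real blockchain or NFT standard).
import hashlib
from dataclasses import dataclass

@dataclass
class Token:
    artwork_hash: str  # fingerprint of the freely copyable digital file
    owner: str         # current holder of the certificate

class Registry:
    def __init__(self) -> None:
        self.tokens: dict[int, Token] = {}
        self.next_id = 0

    def mint(self, artwork_bytes: bytes, owner: str) -> int:
        """Issue a new certificate for a piece of data."""
        token_id = self.next_id
        self.next_id += 1
        self.tokens[token_id] = Token(hashlib.sha256(artwork_bytes).hexdigest(), owner)
        return token_id

    def transfer(self, token_id: int, new_owner: str) -> None:
        """Change who holds the certificate; the artwork itself never moves."""
        self.tokens[token_id].owner = new_owner

# The file can be copied endlessly; only the registry entry is scarce.
registry = Registry()
tid = registry.mint(b"<jpeg bytes>", owner="alice")
registry.transfer(tid, new_owner="bob")
print(registry.tokens[tid])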

@cjd @entreprecariat @rra @jplebreton I found this article very informative concerning the market and who is profiting from it: thatkimparker.medium.com/most-

for me it is quite obvious why NFTs and the art market go so well together: they share an obsession with provenance / authenticity, and in both cases the generation of value is based on speculation. It's a perfect match in economic character, and the talk about democratizing the art market is a rationalization that cannot keep its promises, as the numbers show.

@entreprecariat
The success of Steam. I really thought people wanted big cardboard boxes with their games and manuals in them.

@entreprecariat
- I used to be convinced that "files" or similar units of storage would remain relevant forever. Today, some people have never in their lives encountered the concept of a file, or basically anything that persists when a computer system is turned off and on again. Almost all aspects of computer usage have become transactional, and people prefer to orient themselves along the dimension of time more than anything (although that is crumbling, too).

@entreprecariat Interesting! I also observe that the concept of a file system is held in such high regard that it can serve as an expression of user autonomy. Using a file system at all is itself a signal for "cool stuff", like "host your GeoCities-like website on IPFS" (which is silly empowerment-washing) or "keep your files immutable, forever" (which is ineffective and undesirable)… but in fact it is all more like *SPROING*

@despens @entreprecariat do people prefer it or has the capacity for it been systemically stripped away by the mobile OS duopoly? Google minimizes the file system on Android devices because they want everyone to search for everything (which generates intent data they can extract profit from). Apple obfuscates the fs because they don't even really want users to think of their devices as computers (because consumption appliances are their big moneymakers now)

@despens @entreprecariat these changes are carefully directed acts of top-down user mindset engineering, not some natural evolution toward universally better ideas. these devices still have file systems under the hood and with the right UI design those systems could still be extremely valuable affordances to users.

@jplebreton @entreprecariat Yes, sure, I agree, but it is not that simple.

Many users struggled with the burden of organizing their files before these designs were introduced[1]. Apple & Google exploited this to their gain, in a similar way to how Apple, IBM, Microsoft, Sun, and many others exploited users struggling with file management on the command line and gave them a desktop. There was lots of criticism of that too: users would become dependent on GUIs and never be able to understand how the file system actually works, etc.

But indeed, the mobile design patterns have a different ring to them, esp. when it comes to behavioral surveillance. Yet, Apple and Google produced a set of idioms that users prefer over something like Windows CE or Windows Mobile. These designs are based on user research and A/B testing and do represent something that users actually prefer. (Of course this goes hand in hand with the choices presented to them.)

---

[1]: I witnessed this for more than a decade teaching programming to designers and artists, and there's lots of literature about it.

@despens @entreprecariat Agreed, yeah. A key difference between the GUI revolution and the rise of mobile is that GUIs were seen as massive accessibility wins that simply made computers more appealing to people, whereas with mobile the two companies were very much aware they were creating new markets (platforms, in the modern sense) whose design had to enact their overall business strategy, a constraint that I don't think was present in the GUIs of the early 80s.

@despens @entreprecariat I agree that there have been some real accessibility benefits of app- and task-based (rather than document- or file-system-based) workflows, i.e. what the mobile OSes are built entirely around. But these designs exist only secondarily as affordances and primarily as concretizations of the business model: anything that serves the former but not the latter doesn't stick around.

@jplebreton @entreprecariat Windows 3 was already designed task-centric in an effort to establish activities as products. Since the introduction of the "dock" in Mac OS 10.0, I'd say Apple's desktop is also largely task-centric. The first iPhone had no apps and instead promoted rich web applications. Lots of these developments happen iteratively. And of course a task-based approach matches well with the idea of software subscriptions.

It could have gone another way too: for instance, a subscription-based market could have been created for viewers and editors of different data formats that integrate with a file-based main interface (kind of like KDE was originally designed as a more thorough implementation of Windows' OLE architecture). It would still have been possible to track users and present ads in a file manager (as Windows 7 did for some time).

So if there's a need for another arbitrary change of interface idioms to disrupt an existing software market, there are lots of ideas available to reuse, or new ones to invent. The business model of continuously having to pay (with money or data) to be able to use your device can be mapped onto all types of user interfaces, I believe.

@despens @jplebreton @entreprecariat the personal computer was in some ways a response to the time-sharing model of computing, where you literally did pay for access to data/computation. It was basically as you describe: a subscription-based viewer/editor that accessed resources as you paid for them :)

@emenel @despens @jplebreton @entreprecariat This is a great thread, thank you for your insights!

I am wondering how this development has changed the concept of the «user»?

I mean, with time-sharing we had users who were actually programmers, and sometime later the concept of the end-user came up, meaning something like a software consumer. Now, with data-driven business models, how is this developing further? Any thoughts about this?

@shusha @emenel @despens @entreprecariat I would definitely check out "Lurking: How a Person Became a User" by @jomc for unique insights on this.
It seems like the biggest shift has been how "user" as defined in a consumer products sense kind of gobbled up every other definition during the 90s, and how that in turn fed the shift to users as resources for (ad) data centric businesses.

@despens @shusha @jplebreton @entreprecariat I was about to share many of these same links. The book by @jomc is also very excellent!

@despens @shusha @jplebreton @entreprecariat @jomc also worth looking at McKenzie Wark's work on hackers, users, and vectors.

@emenel @despens @jplebreton @friend.camp thank you all for the hints and links! the book by @jomc is already on my desk, and the text by @entreprecariat was the reason I joined lurk in the first place. 😃 So I am looking forward to discussing this topic with you at a later moment.

@shusha @emenel @despens @jplebreton @entreprecariat a while ago I wrote a bit about this transformation, mostly from the perspective of the UNIX filesystem and home dirs, user and process sandboxing, and app culture; check the first half of this bleu255.com/~aymeric/dump/mans

@jplebreton @entreprecariat I honestly believe that this was the case as well for Microsoft Windows, the Macintosh, Sun, Acorn, etc. Going GUI was a market shakeup that allowed the OS makers to replace existing established software from competitors with their own, tie developers to themselves, and expand their market. Microsoft's OLE design was on the one hand a realization of Kay's Dynabook ideas, but it also made it very hard to port a Windows application to other platforms. Sun tied their OS to their own processor architecture. Apple created a parallel file system universe and sold diskettes with an Apple logo on them for 5x the price of a regular one.

@entreprecariat Btw, I also didn't believe in virtual worlds and 3D space helping people remember where files were stored. Would the interface become more realistic :drake_dislike: or more ubiquitous? :drake_like:

and I was absolutely right
🔮 🧙

@entreprecariat i still believe that digital art on the blockchain is ridiculous, and that it will either pass as a fad or transform into something else rather quickly.

@emenel already happening, from what I understand. NFT art seems to be just a testing ground for general, non-art financial speculation

@entreprecariat Circa 2004 I was reasonably sure that the coming console generation and the jump to HDTV were going to produce a 1983-like market crash. I didn't know enough about the industry or how capital operates in general (i.e. drinks worker blood and shits money). What did happen, though, is that lots of smaller independent studios that had thrived in the 80s and 90s died off as they failed to scale to the art production team sizes needed to fill an HD screen/world with content.

@entreprecariat You could also reverse the equation and consider tech that was hyped as the Next Big Thing but was roundly rejected by the general public. Exhibit A: Google Glass.
