This is maybe the most exciting essay on tech I’ve ever read and crystallizes so many thoughts and feelings I have had around tech that have only really come into focus the last year or so. It makes me super excited about a future in which tech is additive to life instead of brain-rot inducing. The vibe shift is already underway and it’s a “you can just do things” attitude. Like, you can just build tech that doesn’t induce brain rot. “Your negative externalities are my opportunity” is the key. So ripe for this. In a perfect world, trust and integrity will become key competitive advantages.
“Owning the OS means that you get to set the standards for your ecosystem, which developers adopt if they want to reach your customers.”
There are so many banger quotes in this article, but this one has a ton of potential. I know the people want what the people want as it pertains to things like the Google Play Store, but I wonder if that will also change at some point. Nobody needs access to 2 million apps. In that sense, users are almost defenseless against the paradox of choice. It's counterproductive. I think it's possible to have exclusivity and curation as a value proposition at some point, where you have a set of values and standards and, as you say, developers must adhere to them to have a place on your hardware/OS, and thus to reach your customers. It's the best form of competition, and it will hopefully produce completely different products than what is mainstream now.
Basically: Thou shalt not exploit the user.
But I can envision a future in which people think it’s crazy that we subjected ourselves to a non-curated firehose of content. And I think curation can still achieve a non-censorial approach in the sense that 200 or 2,000 apps are better than 2 million, and in those curated selections, you still have access to basically everyone in the world’s ideas and thoughts should you want to find them. But it’s devoid of slop and devs/apps that engage in deliberate brain rot tactics. Like, that can become a norm.
Maybe the competition also then becomes who has the best curation process embedded in their OS/hardware. Who has the best suite. People can critique an OS if it’s discovered that they have some sort of obvious ideological bent. Or critique others if it’s found out they basically just engage in “pay to play” curation.
It’s competitive in the best way. Developers have an entirely new incentive structure that is grounded in something much more meaningful than “let’s get the user to use our app for as much time as humanly possible”. It actually becomes, “how can we help”.
Basically, the biggest risk/barrier here is human folly and selling out to something that is in its nature "bad", which will probably happen if creators have no connection to something transcendent. There needs to be a connection to the ideas you put forth in your Means and Meaning piece.
Just bravo. This is so good. Fingers crossed this all comes to fruition and a better future awaits!
Another quote could be, "Companies who are serious about mental health should make physiologically aligned displays."
You can't app your way out of hardware constraints. Most people believe apps are addictive, that social media is the problem. But this is like blaming fast food for being unhealthy while ignoring the processed ingredients. Any fast food made with the right ingredients can be nourishing; a burger made with hyperpalatable, processed ingredients (the fat-to-carb ratios) triggers overconsumption.
Displays are either hyperpalatable or satiating. Current screens are optimized like processed food. Engineered to be impossible to stop consuming. The physical sensation of looking at most displays is activating. The refresh rates, the color saturation, the brightness curves are not natural. They're the digital equivalent of added sugars and seed oils, designed to override your body's natural satiation signals.
We overconsume social media not because the content is irresistible, but because the display makes it impossible to feel satisfied. The form factor turns you into Eddie Brock, succumbing to your digital symbiote's needs. The haptic feedback is engineered for compulsion. No "digital wellbeing" feature or app can overcome this. Have you ever seen someone who uses a timer for IG? It's useless friction.
If we're serious about solving the mental health crisis in computing, we need to rebuild from the photons up, at the hardware layer: not better apps on toxic displays, but displays designed for satiation rather than endless consumption.
The moat is the ethical foundation that compounds into the future. Tech giants actually can't compete.
Not because they lack resources or talent, but because their foundations carry the burden of irreversibility. Apple and Google made their optimization decisions 30 years ago: blue-light displays, engagement-maximizing refresh rates, form factors designed for constant access. They can't uproot the foundation without invalidating everything built on top. Apple will be like Applebees: in the early 2000s, Applebees tried to go upscale and lost customers. That will be the fate of big tech if they try to compete on display tech.
Daylight's moat is now: the opportunity to build the foundational layer on what society actually needs in the present.
1. Displays for wellbeing,
2. Hardware for satiation,
3. Architecture for human flourishing.
That choice compounds forward (like the doodle shows) into which apps succeed, which developers thrive, and which users stay (the value prop being that their brains don't rot). The ethical constraints become structural advantages that deepen with time.
Start with the right foundation today, and every layer built on top inherits those principles as advantages. Start with the wrong foundation decades ago, and every attempt at reform collides with the architecture that made you successful.
That's the moat: being right at the foundational moment, while others are trapped by being first.
Towards a better future!
Great job Anjan
🔥🔥🔥
Exceptional piece on using hardware revenue to subsidize ecosystem development. The core insight about who benefits versus who decides captures why privacy-focused software fails in markets. Daylight generating $200M+ in cashflow to support resonant computing defaults solves the valley-of-death problem that kills most aligned-incentive platforms. The Bezos inversion of "your negative externalities are my opportunity" is quietly brilliant because it targets second-order value capture that incumbents structurally can't pursue.
Perhaps I can summarize this entire argument as: is it "HARDWARE that creates its own simulation for a user to live in? Can it be one of the worlds in the many worlds we inhabit?"
Maybe I don't want my AI that works on my digital notes to ever know what is in my paper notes. What I read on the Web should never take over recommendations on the Kindle and vice versa.
If that's the case, then you are creating hardware that is not fungible in a user's life. Today it is non-fungible only because of user behaviour; tomorrow it will be non-fungible because of AI-delivered value.
Hardware may not necessarily be the solution to the negative externalities that shortsighted software companies have spilled over, but I do agree that "People who are really serious about software should make their own hardware". Steve Jobs did just that, and I think OpenAI will be the next one.
How does it work with an iPhone? And iCloud email?
The world does need more beautiful things to look at and use, not fewer.