Hacker News

An idea that has been living rent free in my head is that "AI is ultimately nothing but a pure destruction of value". Its promise is unlimited value to everyone on demand; but if everyone can do everything without any effort, it is no longer valuable. Value and scarcity go hand in hand.

I realize the hyperbolic framing of the idea, but nonetheless I haven't been able to get it out of my head. This article feels like it's another piece of the same puzzle.





When something becomes abundant, we focus on something else which is still scarce. That's human nature. Salt used to be scarce and very valuable, but nowadays who thinks about it?

Yes but what happens to society when art, labor, "intelligence" and productivity all become abundant? These are not comparable to salt. You are comparing an apple to every orange tree that has ever existed.

We focus on colonisation of the Universe and becoming a Kardashev Type III civilisation. Literally that.

More likely we devolve into a Kardashian Type society of shallow attention seekers. Or worse, we move towards a permanent divide between the insanely rich who own the machines and everyone else, who struggle to make it through the day.

There are some who would say that both of these have already happened.


krap! maybe earth is ark B

We're just missing around 9001 steps between the devaluation of labor and the space exploration, which according to the current understanding of physics is pure fantasy, because all concepts of faster-than-light travel are purely "what-if" daydreams

Yes, many will suffer, many will die. The streets shall be washed in blood. All of that will happen. Immense, unbelievable suffering.

And after all that we still reach Type III and it was all worth it.


The latency problem proves to be insurmountable and we drift apart in woe at the fruitless sacrifices we made, then splinter in forgetting.

No, we don't. Billions die, including you and me, but whatever is left over populates the entire galaxy.

Worth it.


Yeah, btw, AI was fantasy as well and yet here we are.

we stole the promethean fire

humans won't have to work again

the spark of intelligence is now living in machines

by the way, we still have to make small annoying interruptions to digital content to display ads, everything falls apart if we don't do that

--

Something feels off in this whole idea. I think you guys are overestimating the importance of AI. It's just another commodity.


Thoughts of chronically sedate individuals.

we currently technically do not need to work much (as a species).

it's a societal choice (we're not the ones making it, but it is still a choice).


That specific phrase is not load-bearing to my argument. It's just a sample of things AI enthusiasts say.

I think LLMs are just another commodity. I think AI would be something different.

Emphasis on the conditional tense.


It wasn't me that muddled the terminology. I ain't going to fix it. They wanted to make those synonyms, that's what they'll get.

> but if everyone can do everything without any effort, it is no longer valuable.

And how is this a bad thing? Would it be good for oxygen to be valuable, or water?

What's with this fetishistic obsession with "value"? What's bad about living in a world where nothing holds any value whatsoever?


Nestle lawyers argue that yes, water should be valuable.

Roko's Basilisk will take care of them.

Not everyone is a nihilist

How is not caring about things having a "value" being a nihilist? If everything is absolutely free for everyone, you see that as a bad thing?

“Nothing holds any value” and “nothing costs any money” are two entirely different ideas. The former is nihilism. The latter is communism.

Neither paints a pretty picture in my book


Define "pretty" picture. Everything free for everyone, aka communism, sounds amazing. Tell me how what we have now is better, with people starving and child amputees having to beg for pagpag in the dumps of Manila.

Ah yes on this glorious day let us convene the committee for the meting out of free items to everyone. Ah shit someone stole them all.

Your snarkiness and cynicism doesn't debunk extrapolated long term trend predictions.

Neither does yours.

> Value and scarcity go hand in hand

Not really. To a thirsty soul in the desert, water is worth as much as they value their own life (to some, not much), provided they have something of value to offer the seller. Still, once thirst is quenched, the value to that soul drops to nearly zero.

For an optional good the value only rises to the point that there is excess asset in the inventories of those that would like to add the option.

I would suggest what you are looking for is that some scarcities are shifted by each new technology. Things like the sincere attention of others or more exclusive emotional attachments become relatively more scarce in a goods-abundant existence. More immediately, insight into where to apply the tool, and into what deserves one's attention, becomes scarcer.

If you believe your statement, you would have to accept that you would never value (i.e. need) water again once we could produce far more than we could ever use. Yet your body's need and use would not cease even if the economics collapsed.

Financializing everything can lead one into foolish conclusions.


I do not think value and scarcity are identical, merely that they go hand in hand. In the desert, I would pay a lot of money for water. In the west, I would never pay for a glass of water, even if it's blistering hot and I'm parched. The requirement is the same, the value only changes due to scarcity.

What you are talking about is needs, which is an entirely different discussion. Value can also be used to discuss needs (how valuable is something to survive), but I think I'm quite clearly using it to describe something different.


Price and scarcity go hand in hand, not value and scarcity.

Diamonds are pretty worthless but expensive because they're scarce (putting aside industrial applications), water is extremely valuable but cheap.

No doubt there are some goods where the value is related to price, but these are probably mostly status related goods. e.g. to many buyers, the whole point in a Rolex is that it's expensive.


This conflates use-value and exchange-value. Water to someone dying of thirst has extremely high use-value, while a diamond would in that same moment have nearly no use-value, except for the fact that, as a commodity, the diamond has an exchange-value, and so can be sold to help acquire a source of water.

In a sane world we would just give the poor guy some water and let him keep his precious diamond. And in the sane world, the guy would donate the precious diamond to a museum so that everyone could enjoy its beauty.

What you are describing happens if you follow the mathematical rules of your models too much and ignore the real world.


I prefer the `price = value = relative wealth != wealth = resources` paradigm. Thus, wars destroy wealth and tech advances create wealth, but that's just me

Price is just a proxy for value. Diamonds do not have inherent utility (to the layman) but they are expensive because we societally ascribe value to them.

You’re arguing semantics, but not really tying it back to OP’s point regarding AI and price and/or value.

I thought diamonds are not scarce

The way you articulated it, connected with a thought I've had (for over a year now):

AI is like oil, in that it's "burning" a resource that took geological timescales to accrue. Its value derives from the energy-dense and instantaneous act of combusting a fossil fuel, and in this particular part of the "terrain", it will be a local maximum for A Long Time.

Just like how it's taken absurdly long (still very much WIP) for human societies to prioritize weaning ourselves off of fossil fuels, I fear we are going to latch onto GenAI/LLMs pretty hard, and not let go.


I like the comparison with fossil fuels, that feels very accurate. I hope we do not develop the same dependency.

AI is to many professions what knitting machines were to textile workers (the origin of the word Luddite[0], people who opposed automation).

Some "value" will be lost, but other value will be generated. People WILL lose their jobs, as what they did can either be done by AI directly, or an AI agent can write a script/program to do some or all of what they did.

[0] https://en.wikipedia.org/wiki/Luddite


They were not opposing automation. They were opposing loss of jobs, wages, dignity in work and all that. And they turned out to be completely right. AI is just the next attack from capitalism in a long line of attacks aimed at workers. And this time it's coming for every single worker, given that the explicit purpose is to create AI that can replace every single person.

You think anything they did could've put the automation genie back in the bottle?

Barring massive violence against anyone even thinking about automating something, is there anything they could've done realistically?

We're at the same point with AI now, a bit worse really. People are using AI for _everything_ since it's practically free to shove it at any problem and get "meh" results.

Then they do the math whether "meh" basically for free is better than "decent to good" for a liveable wage.

With knitting machines there was an actual monetary and time cost to getting them running so the adoption was slower.

AI adoption is moving at crazy speeds with no regard for anything. Some of the uses will stick and people will lose jobs, some will be scaled back because "meh" quality isn't sustainable practice.

Boycotting products and companies that use AI in a stupid "meh" way will work eventually, but for some fields it's here to stay because it's just better. Programming is one of them, there's no going back to "stupid" Intellisense or plain tab complete when even a local AI model powered system can pre-fill whole functions with 80-100% accuracy in seconds.


The Luddites weren't against automation, they were retaliating against the capital class. Their demands were to have dignified work, not for automation to go away. They attacked the machines because it was the tool the capital class used to deny them their livelihood.

> AI is just the next attack from capitalism ...

Technology, not capitalism.


Not at all. This technology could be deployed in many ways. For 80 years we have written about ways it could be deployed to benefit society. It is capitalism that has decided to deploy it in the worst ways thought up.

'Because something happened in the past when we were fairly undeveloped, when almost nothing in society had been approached systematically, that same thing will happen in the future where we have systematically optimized everything we can. And we should bet civilization on that. Somehow, magically, everything will work out. And if you don't agree with magic, don't want to bet society on hopes of some magic solution appearing, you are the one denying reality and fighting progress'.

I think it's a fundamental misunderstanding of value. Things are useful regardless of their price; the price is speculative, but if there is a cost to producing something (from the earth, the mind, or AI), the price will not be zero. Scarcity is the basis of value only for things that have no other utility or cost, for example some crypto made just for a pump and dump.

What about land which has high value because of its scarcity?

> Its promise is unlimited value to everyone on demand

No, it’s not. This is where your concept fails. AI is a tool, like any other tool. It doesn’t provide unlimited anything, and, furthermore, it needs human inputs and direction to provide anything. “Go make me a profitable startup from scratch” is not a useful prompt.


But that is exactly the promise that the heads of the AI labs are making. Sam Altman is repeatedly saying he wants to be able to ask it "go discover a new branch of physics".

Perhaps it's not how you use LLMs, but it is the promise of AI.

For the record, I make a distinction between LLMs (a current tool that we have today) and AI (a promise of some mystical all-powerful science-fiction entity).

There is nothing intelligent about what we have today, artificial or otherwise.


> but if everyone can do everything without any effort, it is no longer valuable

It's called utopia.

But my issue with AI hype is that it's not clear how it will lead to "everyone can do anything." Like how is it going to free up land so everyone can afford a house with a yard if they want?


Everyone productive and working can afford a house with a yard now. You’ll be a few dozen miles from others.

If you want everyone to be able to afford a house with a yard within walking distance of downtown Palo Alto, there aren’t enough of them for everyone that wants to do that, and AI (and utopia) can’t change that. Proximity to others creates scarcity because of basic physical laws. This is why California is expensive.

This is something I always wondered about in Banks’ post-scarcity utopian Culture novels. How do they decide who gets to live next door to the coolest/best restaurant or bar or social gathering place? Does Hub (the AI that runs the habitat, and notionally is the city) simply decide and adjudicate like a dictator or king?


In Ada Palmer's Terra Ignota, part of what transforms society into a utopia is the development of some kind of flying car that can take you anywhere in the world in under 2hrs, making borders irrelevant. This transit system is coordinated by a special group.

I would suspect the Culture would have some means to travel very fast. But you are right that it's never explained. In "The Player of Games" I think the main character lives in a beautiful house with an incredible view, and I always wondered: how did he get that house?

If you think about it, the problem could be solved even now: you could use fast trains to connect small cities and replace cars completely.


I think in that same book there was a scene about travelling far, fast. Doing that was still a huge energy expenditure, and done rarely; the fact that it was done for the main character showed it was something the Minds considered important.

The Culture is a bit of a weird post-scarcity utopia. The space habitats are big and housing in them is plentiful, basically given away as toys to placate the fleshbags.


You reminded me that in the Culture series a Mind can teleport people through "Displacement". It's very fast, but at the same time I don't think it serves what OP wants, as it's very risky. It's like planes: we don't use them for everything, in spite of their being very fast haha

It’s “very risky” in that it kills or fucks people up one in a million times. This is roughly 10x more dangerous than the average car trip in the USA, so, quite dangerous.

As TPOG is basically a scathing social commentary of the global west, the fact that something as low risk as driving to and from work for five days in a row was regarded as so dangerous as to only be undertaken in life or death necessity was not lost on me. Cars are insanity.
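The "roughly 10x" comparison above can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch, taking the one-in-a-million Displacement failure rate from the books and assuming commonly cited US figures of about 1.1 traffic deaths per 100 million vehicle-miles and a ~12-mile average trip (the latter two are my assumptions, not from the thread):

```python
# Back-of-the-envelope risk comparison (all figures are assumptions).
displacement_risk = 1 / 1_000_000            # failure chance per Displacement

deaths_per_vehicle_mile = 1.1 / 100_000_000  # assumed US traffic fatality rate
avg_trip_miles = 12                          # assumed average US car trip
car_trip_risk = deaths_per_vehicle_mile * avg_trip_miles  # ~1.3e-7 per trip

ratio = displacement_risk / car_trip_risk
print(f"Displacement is ~{ratio:.0f}x riskier than one average car trip")
```

Under these assumed numbers the ratio comes out around 8, i.e. in the same ballpark as the "roughly 10x" claim.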


I think base case yes, Hub just assigns you something (appropriate, and accommodating your requests as much as possible), but highly desirable locations would be bartered/traded for favors/given to a friend.

I also think that in such a utopia the maximum density would just be a lot lower - no pressure to live near a job or amenities, better transportation, and it'd be easy to move.


In "look to windward", on an orbital, limited tickets for a concert for a famous composer are assigned randomly. A black market forms for those tickets, mostly based on barter as of course they don't have money.

Unless they’d outlawed money, some unit of account and medium of exchange must emerge naturally, even in a post-scarcity society.

Yeah they can, if they want isolation, no internet or water, and no friends around them.

> It's called utopia.

It's not. We already see this on social media: creating pictures and clips has become basically effortless and the result isn't utopia, it's a massive steaming pile of worthless shit.


Sorry, but it's quite the opposite. It's dystopia, considering only the already rich really benefit from it.

Unironically, shouldn't the purpose of human life be turning dystopias into utopias?

Yes. The exact opposite is happening though. We turned from capitalism into late stage capitalism. We feed into the AI hype that's built upon the thievery of creative work, while promising corporate leaders they can let go of their employees at some point. Not being able to purchase food for your kids is not utopia to me.

Especially combined with the AI companies focusing on the destruction of value of human creative output.

Yea that's a shame really... Creativity was one of the only things that made me enjoy my everyday life. I just do everything offline now. Sucks not being able to discover new music as easily anymore though.

Is water less valuable because it comes out of a tap almost for no money?

The answer is wholly dependent on where you live and how scarce water is.

Most in the west don't understand water scarcity.


My question can be asked anywhere in the world, as the availability in question is built into it.


Mmm, kind of. Scarcity is definitely fundamental under capitalism. But what do we do in a theoretical, post-scarcity society?

The digitization of information and media, combined with the Internet and the widespread use of electronic devices, practically means that in some important ways we are already grappling with post-scarcity in certain fields. 600 years ago, "books" and other texts were rare and valuable; then there was an explosive transformation with the invention of the printing press. But while much easier, printing was still a laborious process, and a copy of a book was still a valuable thing. Now, a "book" can exist as an .epub and be copied perfectly a million times practically for free. The same is true for movies, photos, recorded music, news articles, etc.

As a capitalist society, we've really struggled with how to deal with this post-scarcity arrangement. We understand in the abstract that this stuff is important, and that creating it is a laborious process, but we do not really know how to assign value to copies of those works (because, once created, they immediately become infinitely abundant). The best idea we seem to have settled on is artificially creating scarcity by locking the digital works behind paywalls and subscription services that require an account, or maybe DRM paired with a EULA. But I think people generally, and the HN crowd specifically, understand that is a lousy arrangement.

Could energy become so abundant that it is also post-scarcity? Between fusion energy and advancements in solar, wind, and geothermal energy, maybe! It is a tantalizing vision to dream of, but what does that look like under capitalism?


I know what you're getting at, but for the Socratic sake of things, I have bad news! :D

Electricity that is too cheap to meter is possible today. I'm pretty sure that we are technologically capable of producing enough solar panels to supply reasonable energy needs (ignoring AI data center nonsense, for now). I think this is happening already in certain countries, but the economics of it get weird, because even as a public utility, you have to charge something. A market that drives prices down to almost nothing will then cease to exist, and powerful people don't want that to happen.

The real solution is that governments should just build out power capacity and provide electricity as a service to its citizens, like healthcare and education. The solution we'll probably get is some Dickensian torment nexus where orphans are pushed into a meat grinder and our electric bills go up.


I think it is a step towards a money-free world. If only we could invent a food printer.

There is almost zero credible evidence you could point to that this even vaguely resembles the path we are actually on. Sometimes theoretical models don't match reality, and this sure seems to be a good example of that.

Without money what system would be used? Barter? Communism? Warlords?

Only if you assume that the only kind of value is the ability to be sold for a price. Marx would have a word about use value vs exchange value.

> Marx would have a word about use value vs exchange value.

Sounds like a semantics trick. Value is value. Sure, something can have a different value if you exchange it versus if you use it. It can also have a different value if you eat it, or drink it, or smash it, or wear it, or gift it to a family member, or gift it to a friend, or gift it to a lover. "Exchange" is simply one way of use.


Interestingly OP's idea that "AI destroys value" seems to come at least partly from the labor theory value, which Marx accepted (as most classical economists).

Unfortunately, the labor theory of value is self-contradictory. If you invent a new machine that replaces human labor, it will clearly produce more value, yet human labor is reduced. It follows that not all value can be attributed to human labor.

What this really breaks down is meritocracy. If you cannot unambiguously attribute the "effort" of each individual (their labor) to the "value" produced, then such attribution cannot be used as moral guidance anymore.

So this breaks the right-wing idea that the different incomes are somehow deserved. But this is not new, it's just more pronounced with AI, because the last bastion of meritocracy, human intelligence ("I make more because I'm smarter"), is now falling.

Addendum: Although accounts differ on this, Marx seemed to struggle with LTV, IIRC Steve Keen's Debunking Economics shows Marx contradicting himself on it.


I disagree about innovation in automation creating a contradiction in LTV. LTV states that the exchange-value of goods is determined by the socially necessary amount of labor needed to produce them. Automation only means that the socially necessary amount of labor changes, so the exchange-value changes too.

Also, in Marx's theory exchange-value is something different from use-value, the latter being unaffected by automation.
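The claim above, that automation merely changes the socially necessary labor time (SNLT) rather than contradicting the LTV, can be made concrete with a toy model in which exchange-value is simply proportional to SNLT. All numbers here are invented purely for illustration:

```python
# Toy model: exchange-value proportional to socially necessary labor time.
# Numbers are invented purely for illustration.

def exchange_value(snlt_hours: float, value_per_hour: float = 1.0) -> float:
    """Exchange-value as a linear function of SNLT, per the LTV claim."""
    return snlt_hours * value_per_hour

coat_before = exchange_value(10.0)  # hand-knitting a coat: 10 hours of average labor
coat_after = exchange_value(2.0)    # a knitting machine cuts the SNLT to 2 hours

# Under this reading there is no contradiction: automation lowers SNLT,
# so the coat's exchange-value falls with it, while its use-value
# (a warm coat) is unchanged.
print(coat_before, coat_after)
```

The point of the sketch is only that, on this account, the machine does not "create value from nothing"; it changes how much labor a unit of the good embodies.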


Well, but under free market conditions, prices based on use and exchange value should equalize. So the paradox will appear, unless you have a planned economy.

Maybe Marx resolved the tension by converting the contradiction into a (wider) capitalism contradiction, and was happy with that solution. Whether it makes OP happy in the age of AI ("everything is capital and you're screwed if you don't own it"), not sure.


All value is derived from labor.

Marx agrees on this, Adam Smith agrees on this, John Locke agrees on this, heck even Keynes agrees on this; all (sensible) economists agree on this. If you do not have labor somewhere in the process, you do not have value.

“Equal quantities of labor, at all times and places, may be said to be of equal value to the laborer. In his ordinary state of health, strength and spirits; in the ordinary degree of his skill and dexterity, he must always lay down the same portion of his ease, his liberty, and his happiness. The price which he pays must always be the same, whatever may be the quantity of goods which he receives in return for it. Of these, indeed, it may sometimes purchase a greater and sometimes a smaller quantity; but it is their value which varies, not that of the labor which purchases them.” says Smith.

From this Smith concludes that “labor alone, therefore, never varying in its own value, is alone the ultimate and real standard by which the value of all commodities can at all times and places be estimated and compared. It is their real price; money is their nominal price only.”

It's a bit long-winded (brief by his standards, though), but the essence is that labor ultimately determines whether there is value. But, critically, not the amount of value. Marx would say that the amount of value is the amount of effort that goes into the commodity, something we now know not to be true.

Still, the point here with AI:

If the labor in the products and services that AI produces goes to 0, then the value of those goods and services must also go to 0.

As a brief example, look at chess or F1 racing. There are chessbots that can beat any human, there are F1 robots that can outrace and win against any human. Yet still, we find no value in watching a robot beat up on a human or on another robot. No-one watches or cares to watch those kinds of competitions. We only care to watch other humans compete against each other. There are many reasons for this, but one is that there is labor involved in the competition.


I basically agree all value is derived from labor, but a lot of modern economists do not.

There's an interesting book called "This Life: Secular Faith and Spiritual Freedom" by Martin Hägglund. Part 2 of the book is really concerned with the labor theory of value, and it articulated it in a way I'd never really understood before. It's hard to summarize in a short post, but here's an essay that engages with the ideas in the span of a few pages: https://www.radicalphilosophy.com/article/the-revival-of-heg...

Really, I encourage people to check out the book. It was at times challenging, but always thought-provoking. Even when I found myself disagreeing (I have some fundamental disagreements with part 1), it helped me articulate my own worldview in a way that few books have before. It's something special. Anyway, the book really cemented and clarified my views on the labor theory of value.


> All value is derived from labor... all (sensible) economists agree on this

The labor theory of value is one of multiple theories of value [1]. And it is still widely debated.

> Marx would say that the amount of value is the amount of effort that goes into the commodity, something we now know not to be true.

Marx would say the _exchange_ value of a commodity is proportional to the amount of _socially necessary labor time_ required to produce it. Again, something that is debatable.

[1] https://en.wikipedia.org/wiki/Value_(economics)#Theories [2] https://en.wikipedia.org/wiki/Criticisms_of_the_labour_theor...


> An idea that has been living rent free in my head is that "AI is ultimately nothing but a pure destruction of value". Its promise is unlimited value to everyone on demand; but if everyone can do everything without any effort, it is no longer valuable. Value and scarcity go hand in hand.

1) I think it's the destruction of our value, as workers. Without an unthinkable change in society, we'll be discarded.

2) I think it will also destroy the unrealized value of not-yet-created work, first by overwhelming everything with a firehose of mediocre slop, then by disincentivizing the development of human talent and skill (because it will be an easy button that removes the incentive to do that). AI will exceed humans primarily by making humans dumber, not by exceeding humans' present-day capabilities. Eventually creative output will settle at some crappy, derivative level without any peaks that rise above it.


There is a very strong argument that if your work output can be discarded effectively in favor of a firehose of mediocre slop, then it is a moral imperative that we stop employing human beings in those roles as it’s a terrible waste of a human life.

The only people I see handwringing over AI slop replacing their jobs are people who produce things of quality on the level of AI slop. Nobody truly creative seems to feel this way.


Video scoring people are feeling it. I think a world with Hans Zimmer soundtracks, with Tron 2 having a Daft Punk soundtrack, is a richer world than one where soundtracks are machine generated.

Creative people actually do feel this way. There are huge discussions about it going on by actual creative people. Why are you hand waving that away and saying if they are discussing it they must not be adding any value, therefore their discussion is discarded? It's definitely a convenient position for you to take, but it doesn't seem like a real position when objectively great talent are taking the position you say only poor talent would take?


No movie studio would choose AI slop when people like John Williams or Hans Zimmer exist. That’s a ridiculous argument. It’s such a simple way to differentiate and compete. Whatever Williams cost, Lucas made it back 100x.

If AI gets good enough to replace them, then we can have a different discussion - but I don’t think you get truly great art without the full spectrum of human emotion and experience - that is, full AGI. In that case, all jobs are toast and we don’t need to have this discussion.


> No movie studio would choose AI slop when people like John Williams or Hans Zimmer exist.

I wouldn't be so sure. During the writers' strike I heard the producers were hoping to replace a lot of their work with AI.

> but I don’t think you get truly great art without the full spectrum of human emotion and experience

The movie industry is in the business of selling tickets, and the TV industry is in the business of getting people to look at ads. Creating "truly great art" is not the priority, but sometimes happens because people are still involved.

Our choices as consumers are constrained. If they all get compromised at the same time, because the producers are following similar incentives, the market won't punish them.


But of course. You would only worry about AI if it will replace your job.

Factory workers didn't worry about cars, but buggy drivers did. Office workers didn't worry about factory automation, but factory workers did.


But don't you know, 60 years later, after WW2, the labor market worked out. After a few minor details happened in between. So things turned out fine for the buggy drivers, and you are freaking out just because you are the new buggy driver (things did not in fact work out for the buggy drivers, but that's just a small detail glossed over because things worked out for people after WW2; we have no idea how the buggy drivers' lives turned out in our example of everything working out for them).

Things will be fine (it might take another 60 years and a few minor details, but it will all magically work out, like it did then. Not for the buggy drivers though. They were fucked; broken people living on skid row became so common it was a popular trope in children's cartoons).


You’re doing that thing that people sometimes do where they say something incredibly naive but do so in such a confident manner that they imply they are really this enlightened individual and it’s everyone else who’s dumb.

But this idea that you’ve put forth really doesn’t hold up to even the lightest bit of scrutiny when you actually start thinking about what this would look like in reality.


But this is just the full-on race to the bottom. Stated simply, this philosophy would be "there is only power, and those too weak to be effective at wielding it".

I think exactly the opposite of you because to me consuming from a firehose of slop is the most terrible way you could waste a human life.


Counting on:

- AI never advancing past a "firehose of mediocre slop"

- Consumers as an aggregate "choosing" quality over cost and availability

is a good way to never worry about AI, yes. But that's not the assumptions this article or thread is written on.


> There is a very strong argument that if your work output can be discarded effectively in favor of a firehose of mediocre slop...

> The only people I see handwringing over AI slop replacing their jobs are people who produce things of quality on the level of AI slop. Nobody truly creative seems to feel this way.

Have you ever worked for an American company? They almost always choose slop over quality. Why should an executive employ a skilled American software engineer when he can fire him and hire three offshore engineers who don't really know what they're doing for half the price? Things won't blow up immediately; there's a chance they'll just limp along in a degraded state, and by then the executive will be off somewhere else with a bonus in his pocket.

Also, how many people are "truly creative" and how does that compare to the number of people who have to eat?

> then it is a moral imperative that we stop employing human beings in those roles as it’s a terrible waste of a human life.

And what should they do then? Sit around jerking off under a bridge?

There's no "moral imperative" to cast people off into poverty. And that's what will happen: there will be no retraining, no effort to find roles for the displaced people. They'll just be discarded. That's a "terrible waste of a human life."


The business sociopaths need to be able to shit out code and logos and voices and ads for their SaaS crypto casino social media feed fake therapy dating gig worker app, and AI makes it much cheaper to do so. The social value is there, AI lets us do more with less but it's always captured by the few people on top.


