Right, but if it's noticeably hotter than the environment, then that temperature difference could be used to drive a heat engine and get some more useful work. So the knee-jerk response "omg, we see the heat from space? it's gotta be wasteful!" is kind of correct, in theory.
Some people are saying "waste heat" in the technical sense of "the heat my industrial process created and I need to get rid of" and others are saying "waste heat" as "heat humans are emitting into space without slapping at least one Carnot engine on it yet".
If the heat being generated were economically worthwhile, the miners would be incentivized to use it to offset their costs. Since they aren't, we can somewhat reasonably assume that it would cost more to recapture than it's probably worth.
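For a rough sense of scale (the temperatures here are my own assumptions, not numbers from anyone's rig): even an ideal Carnot engine running off warm exhaust air can only ever extract a small fraction of that heat as work, which is a big part of why recapture rarely pencils out economically.

    // Rough Carnot upper bound on work recoverable from low-grade waste heat.
    // The 60 °C / 20 °C figures are illustrative assumptions, not data from the thread.
    const T_HOT = 273.15 + 60;   // exhaust air, ~60 °C, in kelvin
    const T_COLD = 273.15 + 20;  // ambient, ~20 °C, in kelvin

    const carnotEfficiency = 1 - T_COLD / T_HOT;
    console.log(carnotEfficiency.toFixed(2)); // ~0.12 -- at best ~12% of the heat comes back as work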
Is it the same as flipping every parenthesis to the other side of the number it's adjacent to, and then adding enough parentheses at the start and end?
For example,
(1 + 2) * (3 + 4)
becomes
1) + (2 * 3) + (4
and then we add the missing parentheses and it becomes
(1) + (2 * 3) + (4)
which seems to achieve a similar goal and is pretty unambiguous.
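Here's a minimal sketch of that transformation as I read it (the tokenizer, and the exact meaning of "flipping" -- mirror the parenthesis and move it to the other side of its adjacent number, then balance -- are my own assumptions; it just reproduces the example above):

    // Mirror each parenthesis and move it to the other side of its adjacent number,
    // then add enough parentheses at the start/end to balance the result.
    // Assumes integer literals and single-character operators, purely to mirror the example.
    function flipParens(expr) {
      const tokens = expr.match(/\d+|[()+\-*/]/g);
      const out = [];
      for (let i = 0; i < tokens.length; i++) {
        const t = tokens[i];
        if (t === "(" || t === ")") continue;        // re-emitted next to their numbers below
        if (/^\d+$/.test(t)) {
          if (tokens[i + 1] === ")") out.push("(");  // ")" after a number flips to "(" before it
          out.push(t);
          if (tokens[i - 1] === "(") out.push(")");  // "(" before a number flips to ")" after it
        } else {
          out.push(t);                               // operators pass through unchanged
        }
      }
      // Balance: prepend an opener for every unmatched closer, append a closer for every unmatched opener.
      let depth = 0, missingOpen = 0;
      for (const t of out) {
        if (t === "(") depth++;
        else if (t === ")") depth > 0 ? depth-- : missingOpen++;
      }
      return "(".repeat(missingOpen) + out.join(" ") + ")".repeat(depth);
    }

    console.log(flipParens("(1 + 2) * (3 + 4)")); // ( 1 ) + ( 2 * 3 ) + ( 4 )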
I guess from the inside it feels different: I'll read 99 mind-numbingly bad comments and cut them all slack (in the sense of not replying to them at all), but these 99 instances of benevolence are invisible and count for nothing, because the 100th comment will make me fly into a rage and that's when I'll actually post something. And unload a bunch of my frustrations from the previous 99, too. The internet selects for extreme reactions.
Altruist? DARPA is a military agency, and ARPANET was a prototype network designed to survive a nuclear strike. I think the grandparent comment's point is that the innovation was government-funded and made available openly, neither of which depends in the slightest on it being altruistic.
> The CYCLADES network was the first to make the hosts responsible for the reliable delivery of data, rather than this being a centralized service of the network itself. Datagrams were exchanged on the network using transport protocols that do not guarantee reliable delivery, but only attempt best-effort [..] The experience with these concepts led to the design of key features of the Internet Protocol in the ARPANET project
Keeping with the theme of the thread, CYCLADES was destroyed because of greed:
> Data transmission was a state monopoly in France at the time, and IRIA needed a special dispensation to run the CYCLADES network. The PTT did not agree to funding by the government of a competitor to their Transpac network, and insisted that the permission and funding be rescinded. By 1981, Cyclades was forced to shut down.
> Rumors had persisted for years that the ARPANET had been built to protect national security in the face of a nuclear attack. It was a myth that had gone unchallenged long enough to become widely accepted as fact.
No, the Internet (inclusive of ARPANET, NSFNet, and so on) was not designed to survive a nuclear war. It's the worst kind of myth: one you can cite legitimate sources for, because it's been repeated long enough that even semi-experts believe it.
The ARPANET was made to help researchers and to justify the cost of a mainframe computer:
> It's understandable how it could spread. Military communications during Nuclear War makes a more memorable story than designing a way to remote access what would become the first massively parallel computer, the ILLIAC IV. The funding and motivation for building ARPANET was partially to get this computer, once built, to be "online" in order to justify the cost of building it. This way more scientists could use the expensive machine.
That's a valiant attempt at myth-fighting, but it doesn't fully convince me. For example, one hop to Wikipedia gives this:
> Later, in the 1970s, ARPA did emphasize the goal of "command and control". According to Stephen J. Lukasik, who was deputy director (1967–1970) and Director of DARPA (1970–1975):
> "The goal was to exploit new computer technologies to meet the needs of military command and control against nuclear threats, achieve survivable control of US nuclear forces, and improve military tactical and management decision making."
> That's a valiant attempt at myth-fighting, but it doesn't fully convince me. For example, one hop to Wikipedia gives this:
And in that same Wikipedia section there are 3-4 other people who say otherwise, including Herzfeld, the director who actually authorized the start of the project.
Meanwhile you cherrypick 1-2 paragraphs at the end, while there are over a dozen that say the opposite. Note that Lukasik was later the director of DARPA, which had a completely different mandate than (no-D) ARPA.
Not to mention the functions are also translated to the other language. I think both of these are Excel's fault, to be honest. I had this problem long before Google came around.
And it's really irritating when you have the computer read something out to you that contains numbers. 53.1 km reads like you expect but 53,1 km becomes "fifty-three (long pause) one kilometer".
> Not to mention the functions are also translated to the other language.
This makes a lot of sense when you recognize that Excel formulas, unlike proper programming languages, aren't necessarily written by people with a sufficient grasp of English, especially for the more abstract mathematical concepts, which are taught not in secondary-school English classes but in their native-language mathematics classes (SUM becomes SOMME in the French version, for instance).
The behaviour predates Google Sheets and likely comes from Excel (whose behaviour Sheets emulates/reverse-engineers in many places). And I wouldn't be surprised if Excel got it from Lotus.
Not sure if this is still the case, but Excel used to fail to open CSV files correctly if the locale used a list separator other than ',' – for example ';'.
Sometimes you double-click and it appears to open everything just fine, while silently corrupting, changing, and dropping data, without warning or notification and with no way to prevent it.
The day I found out that IntelliJ has a built-in tabular CSV editor and viewer was the best day.
Given that the world is about evenly split on the decimal separator [0] (and correspondingly on the thousands grouping separator), it’s hard to avoid. You could standardize on “;” as the argument separator, but “1,000” would still remain ambiguous.
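To make the ambiguity concrete (Intl.NumberFormat has nothing to do with Sheets; it's just a convenient way to show the two conventions side by side):

    // The same number under the two dominant conventions:
    const n = 1000.5;
    console.log(new Intl.NumberFormat("en-US").format(n)); // "1,000.5"
    console.log(new Intl.NumberFormat("de-DE").format(n)); // "1.000,5"

    // Going the other way, "1,000" is one thousand in en-US but exactly one
    // (to three decimal places) in de-DE -- a parser can't tell without knowing the locale.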
Aha, in Microsoft Excel they even translate the shortcuts. In the Brazilian version, Ctrl+S is "Underline" instead of "Save". Every sheet of mine ends up with a lot of underlined cells :-)
I spent some time with Apps Script a few weeks ago. It has some strange design decisions:
1) Everything runs on the server, including triggers and even custom functions! This means every script call requires a roundtrip, every cell using a custom function requires a roundtrip on each change, and it feels much slower than the rest of the UI.
2) You can't put a change trigger on a cell or subset of cells, only on the whole sheet. So you have to manually check which cell the trigger happened on (see the sketch after this list).
3) Reading and writing cell values is so slow (can be a second or more per read or write) that the semi-official guidance is to do all reads in a bunch, then all writes in a bunch. And it's still slow then.
4) A lot of functionality, like adding custom menus, silently doesn't work on mobile. If your client wants to use Sheets on mobile, get ready to use silly workarounds, like using checkboxes as buttons to trigger scripts and hoping the user doesn't delete them.
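For point 2, the workaround ends up looking something like this (a minimal sketch; the sheet name "Config" and the cell addresses are made up for illustration):

    // Simple onEdit trigger: it fires for *any* edit in the spreadsheet,
    // so the script has to filter down to the cell it actually cares about.
    function onEdit(e) {
      const range = e.range;
      const sheet = range.getSheet();

      // Ignore everything except the one cell we're watching.
      if (sheet.getName() !== "Config" || range.getA1Notation() !== "B2") return;

      // Incidentally, this is the same pattern the "checkbox as button" workaround
      // from point 4 relies on: the checkbox is just another watched cell.
      sheet.getRange("C2").setValue("Last edited: " + new Date());
    }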
Overall I got the feeling that Google never tried to "self host" any core Sheets functionality using Apps Script. If they had, it would be much faster and more complete.
> 2) You can't put a change trigger on a cell or subset of cells, only on the whole sheet. So you have to manually check which cell the trigger happened on.
This is true of MS Excel's scripting language (VBA) as well. Worksheets are objects with events; cells are objects without (VBA-accessible) events.
But Google Sheets remote procedure calls are vastly slower than local OLE/COM dispatching. (And VBA/Excel presumably uses the optimized, tighter COM interface binding instead of the slower high-level COM IDispatch. Sure, there's some overhead, but it's nothing compared to Google Sheets' network overhead.)
Not only is scripting Google Sheets indeterminately and syrupy slow, it also imposes an arbitrary limit on how long your code can run, making a lot of applications not just inefficient but impossible. Running your code in Google's cloud doesn't make spreadsheet API calls any faster; it just limits how long you can run, then BAM!
To get anything non-trivial done, you have to use getSheetValues and ranges to read and write big batches of values as 2d arrays.
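In practice that looks something like the sketch below (same idea with getRange().getValues()/setValues(); the sheet name and the doubling operation are placeholders):

    // Read the whole block once, transform in memory, write it back once --
    // two roundtrips instead of one per cell.
    function doubleAllNumbers() {
      const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Data"); // placeholder name
      const range = sheet.getRange(1, 1, sheet.getLastRow(), sheet.getLastColumn());

      const values = range.getValues(); // one batched read: a 2D array of cell values
      const doubled = values.map(row =>
        row.map(v => (typeof v === "number" ? v * 2 : v))
      );
      range.setValues(doubled);         // one batched write
    }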
It's easier to just download the entire spreadsheet CSV or layers and bang on that from whatever language you want, instead of trying to use Google-hosted spreadsheet scripts.
> Everything runs on the server, including triggers
I think that’s a consequence of the fact that multiple users can simultaneously edit a sheet. Yes, Google could special-case the “you are the single user of this sheet” case, but that’s extra work, and, I think, would be fairly complicated when handling edge cases where users frequently start and stop editing a sheet.
> I think that’s a consequence of the fact that multiple users can simultaneously edit a sheet.
No, it's not. Built-in functions like SUM recalculate instantly, and custom formatting rules (e.g. "color green if above zero") get applied instantly, even when there are multiple users editing a sheet. Running custom functions and triggers on the server is just a decision they made.
This reason doesn't make much sense to me. Let's say I write a non-idempotent custom function. It makes the spreadsheet behave weirdly: recalculating a cell twice leads to a different effect than recalculating it once. Does it matter whether the function runs on the server or the client? No, the spreadsheet will behave weirdly in either case, even with just one user.
Can we make a programming language that will save developers from that? Maybe, but that would be very hard and that's not what Apps Script is trying to do. It already allows non-idempotence, trusting developers to write idempotent code when they need to. So it could run on the client just fine.
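To be concrete about what I mean by non-idempotent (a toy example; a function with real side effects, say one that logs to an external service on every recalculation, behaves the same way):

    // A non-idempotent custom function: each recalculation produces a different value,
    // so recalculating twice is not the same as recalculating once.
    // Whether this runs on the server or on the client changes nothing about that.
    function ROLL_DICE() {
      return Math.floor(Math.random() * 6) + 1;
    }
    // In a cell: =ROLL_DICE()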
It's so half-assed -- why just tack a zero onto the Christian year? In the Yoruba calendar it's 10,067 -- use that, and it puts things in real perspective. We're ten thousand years from the beginning of civilization, not two thousand. Now THAT gives some perspective on the "long now".
Maybe more like twelve thousand years from the beginning: https://en.wikipedia.org/wiki/G%C3%B6bekli_Tepe. The real crucial question is whether we're two years from the end of civilization or two trillion.