
30 comments

  • bob1029 · 13 hours ago

    > Secondly, we have got machines equipped with multi-level stores, presenting us problems of management strategy that, in spite of the extensive literature on the subject, still remain rather elusive.

    NUMA only got more complicated over time. The range of latency differences is more extreme than ever: we've got L1 running at nanosecond latency, and on the other end we've got cold tapes that can take a whole day to load. Which kind of memory/compute to use in a heterogeneous system (CPU/GPU) can also be difficult to figure out. Multi-core is likely the most devastating dragon to arrive since this article was written.

    Premature optimization might be evil, but it's the only way to efficiently align the software with the memory architecture. E.g., in a Unity application, rewriting from GameObjects to ECS is basically like starting over.

    If you could only focus on one aspect, I would keep the average temperature of L1 in mind constantly. If you can keep it semi-warm, nothing else really matters. There are very few problems that a modern CPU can't chew through ~instantly assuming the working set is in L1 and there is no contention with other threads.
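
    A minimal sketch of that locality effect in C (the sizes and the exact speedup are machine-dependent; this is an illustration, not from the comment). The same sum is computed twice over one array; only the traversal order changes, yet the sequential walk keeps the working set streaming through the cache while the strided walk thrashes it:

        /* locality.c: same work, two traversal orders.
           The row-major loop walks memory sequentially; the column-major
           loop strides by COLS doubles per step and misses far more often. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        #define ROWS 4096
        #define COLS 4096

        int main(void) {
            double *a = malloc(sizeof(double) * ROWS * COLS);
            if (!a) return 1;
            for (size_t i = 0; i < (size_t)ROWS * COLS; i++) a[i] = 1.0;

            double sum = 0.0;
            clock_t t0 = clock();
            for (size_t r = 0; r < ROWS; r++)       /* cache-friendly */
                for (size_t c = 0; c < COLS; c++)
                    sum += a[r * COLS + c];
            clock_t t1 = clock();
            for (size_t c = 0; c < COLS; c++)       /* cache-hostile */
                for (size_t r = 0; r < ROWS; r++)
                    sum += a[r * COLS + c];
            clock_t t2 = clock();

            printf("row-major %.3fs, column-major %.3fs (sum=%.0f)\n",
                   (double)(t1 - t0) / CLOCKS_PER_SEC,
                   (double)(t2 - t1) / CLOCKS_PER_SEC, sum);
            free(a);
            return 0;
        }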

    This is the same thinking that drives some of us to use SQLite over hosted SQL providers. Thinking in terms of not just information, but the latency domain of the information, is what can unlock those bananas 1000x+ speedups.

  • varjag · 5 hours ago

    Such an evergreen observation:

    Nowadays one often encounters the opinion that in the sixties programming has been an overpaid profession, and that in the coming years programmer salaries may be expected to go down. Usually this opinion is expressed in connection with the recession, but it could be a symptom of something different and quite healthy, viz. that perhaps the programmers of the past decade have not done so good a job as they should have done. Society is getting dissatisfied with the performance of programmers and of their products.

  • saghm · 8 hours ago

    > A study of program structure had revealed that programs —even alternative programs for the same task and with the same mathematical content— can differ tremendously in their intellectual manageability. A number of rules have been discovered, violation of which will either seriously impair or totally destroy the intellectual manageability of the program. These rules are of two kinds. Those of the first kind are easily imposed mechanically, viz. by a suitably chosen programming language. Examples are the exclusion of goto-statements and of procedures with more than one output parameter. For those of the second kind I at least —but that may be due to lack of competence on my side— see no way of imposing them mechanically, as it seems to need some sort of automatic theorem prover for which I have no existence proof. Therefore, for the time being and perhaps forever, the rules of the second kind present themselves as elements of discipline required from the programmer. Some of the rules I have in mind are so clear that they can be taught and that there never needs to be an argument as to whether a given program violates them or not. Examples are the requirements that no loop should be written down without providing a proof for termination nor without stating the relation whose invariance will not be destroyed by the execution of the repeatable statement.

    Interestingly, designing a language that enforces that loops come with an invariant and a proof of termination is actually possible; Coq, for example, does pretty much exactly this, from what I understand. My understanding is that this means it isn't Turing complete, but maybe Turing completeness isn't as necessary for as many things as it might otherwise seem.

    jmj · 6 hours ago

    <@saghm> An invariant is a property that is preserved on every iteration. Proof of termination in imperative languages can be done by showing that a natural number decreases with every step.
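
    A hand-rolled sketch of that discipline in plain C, with runtime asserts standing in for what a verifier would prove statically (the division example is mine, just an illustration):

        /* divide.c: integer division by repeated subtraction, with the
           loop invariant and the decreasing measure (variant) made explicit.
           A verifier would discharge these checks at compile time. */
        #include <assert.h>
        #include <stdio.h>

        unsigned divide(unsigned n, unsigned d) {
            assert(d > 0);
            unsigned q = 0, r = n;
            while (r >= d) {
                unsigned old_r = r;     /* snapshot for the variant check */
                r -= d;
                q += 1;
                assert(q * d + r == n); /* invariant: holds every iteration */
                assert(r < old_r);      /* variant: r strictly decreases,
                                           so the loop must terminate */
            }
            assert(q * d + r == n && r < d); /* postcondition */
            return q;
        }

        int main(void) {
            printf("%u\n", divide(17, 5)); /* prints 3 */
            return 0;
        }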

    Dafny implements this at the compiler level (and has a curly-brace syntax!).

    Coq uses other methods more tailored towards recursion.

    You are right that if every loop must terminate, it is not Turing complete. So some valid programs will not compile.

    There are also some interesting programs that potentially never terminate (servers, daemons, OSes, games, etc.) to which formal methods can be applied, for instance to prove that they preserve a certain property, or that they don't terminate, or terminate only under certain conditions.

    I find the theory extremely elegant and pleasurable, but it's obviously not everyone's cup of tea, as shown by its lack of widespread use.

    LLMs might create a revival in the coming years, for the following reasons:

    1) cost of formalization goes down
    2) cost of proving goes down
    3) cost of programming goes down
    4) provable code quality becomes a differentiator among a sea of programs

  • ddtaylor · 8 hours ago

    What computer is he referring to?

    > When these machines were announced and their functional specifications became known, quite a few among us must have become quite miserable; at least I was. It was only reasonable to expect that such machines would flood the computing community, and it was therefore all the more important that their design should be as sound as possible. But the design embodied such serious flaws that I felt that with a single stroke the progress of computing science had been retarded by at least ten years: it was then that I had the blackest week in the whole of my professional life. Perhaps the most saddening thing now is that, even after all those years of frustrating experience, still so many people honestly believe that some law of nature tells us that machines have to be that way. They silence their doubts by observing how many of these machines have been sold, and derive from that observation the false sense of security that, after all, the design cannot have been that bad. But upon closer inspection, that line of defense has the same convincing strength as the argument that cigarette smoking must be healthy because so many people do it.

    ddtaylor · 8 hours ago

    <@ddtaylor> Apparently the IBM System/360, and by extension OS/360, which was apparently so "problematic" that it inspired the book The Mythical Man-Month. Neat.

    vincent-manis · 2 hours ago

    <@ddtaylor> OS/360 was a victim of over-ambition. Originally, it was to run on all models of System/360 (maybe not the Model 20, which implemented only a subset of the ISA), and it was complex enough that there were challenges in building it. IBM ended up spinning off the smaller models to IBM Germany (DOS/360 came from there) and subsetting OS/360 into three levels, so they could get something out the door only a bit late. By some measures OS/360 is one of the most successful operating systems ever: IBM's current z/OS is a remote descendant of it. Yes, it is pretty horrible to use (three letters: JCL), but it certainly was successful.

    “The Mythical Man-Month” is not about OS/360 as such, but about project planning, and specifically what was learned about project management during its development.

  • Michelangelo11 · 8 hours ago

    > The first effect of teaching a methodology —rather than disseminating knowledge— is that of enhancing the capacities of the already capable, thus magnifying the difference in intelligence.

    Absolutely right, with the implication that new capabilities suddenly available to everyone often end up making the playing field more unequal, not less.

    stillpointlab · 7 hours ago

    <@Michelangelo11> I feel we are seeing this now with the adoption of coding agents.

    koakuma-chan · 7 hours ago

    <@stillpointlab> Don't tell them!

  • stereolambda · 12 hours ago

    In the articles and talks from that time, people often take the perspective of what the whole society (with its organizations) wants from the "automatic computers" and from programmers as a profession. Compare also something like Grace Hopper's 1982 talk on YouTube. Now it's mostly the perspective of companies, teams, the industry. Did this shift happen in the 1990s? I'm guessing here.

    I guess there is still something left of the concept of a programming language as a tool for top-down shaping and guiding the thinking of its users, Pascal being the classic example. Golang tries to be like that. I get how annoying it can be. I don't know how JS/TypeScript constructs evolve, but I suspect this is more Fortran-style committee planning than trying to "enlighten" people into doing the "right" things. Happy to be corrected on this.

    Maybe the hardest point to interpret in hindsight is that programming in the sixties was an overpaid profession, that hardware costs would keep dropping, and that software costs could not stay the same ("You cannot expect society to accept this, and therefore we must learn to program an order of magnitude more effectively"). Yeah, in some sense, what does paying for software even mean anymore?

    But interestingly, the situation now is kind of similar to the very old days: a bunch of mainframe ("cloud") owners paying programmers to program and manage their machines. And maybe the effectiveness really has gone up dramatically: there's relatively little software running in comparison to the crazy volume of metal machines, even though the programmers for that scale are still paid a lot. It's not like you get a team of 10 guys programming each individual server.

  • Nicook · 6 hours ago

    Java really needs to take a look at the

    > baroque monstrosity

    warning. Probably beating a dead horse here, but there are way too many tools, and they keep adding more.

    TremendousJudge · 4 hours ago

    <@Nicook> hah, when I read that part, I immediately thought of C++. But I guess all the bigcorp languages suffer from that same issue.

  • puttycat · 10 hours ago

    What a joy to find a plaintext HTML page (and such a wonderful text of course).

  • selcuka · 16 hours ago

    > The sooner we can forget that FORTRAN has ever existed, the better, for as a vehicle of thought it is no longer adequate: it wastes our brainpower, is too risky and therefore too expensive to use.

    Apparently the ISO/IEC 1539-1:2023 [1] committee didn't get the memo.

    [1] https://www.iso.org/standard/82170.html

    pjmlp · 16 hours ago

    <@selcuka> Modern Fortran is quite neat, and much better than having to deal with Python + rewriting code into C and C++.

  • enord · 16 hours ago

    It’s a real shame Dijkstra rubbed so many people the wrong way.

    Maybe his incisive polemic, which I greatly enjoy, was all but pandering to a certain elitist sensibility in the end.

    To make manageable programs, you have to trade off execution speed, both on the CPU and in the organization. His rather mathematized prescriptions imply we should hire quarrelsome academics such as him to reduce performance and slow down product development (initially…), all in the interest of his stratified sensibilities of elegance and simplicity.

    Sucks to be right when that’s the truth.

  • b0a04gl · 10 hours ago

    Dijkstra's take aged better than most things from that era. I still see teams chasing fast output over clean design and hitting walls later. The mind map linked in the thread does a decent job condensing it; worth a skim even if you've read the essay before.

  • ddtaylor · 8 hours ago

    > In this sense the electronic industry has not solved a single problem, it has only created them, it has created the problem of using its products.

    Oh boy does that read VERY true today!

    AnimalMuppet · 5 hours ago

    <@ddtaylor> Not to me. The electronic industry has created new problems, but it has solved old ones.

    I'm old enough to remember what text editing was like before word processors. I'm old enough to remember trying to reach people before cell phones. I'm old enough to remember trying to find information in a physical library. There are a lot of problems that electronics has solved.

  • gobblik · 12 hours ago

    Or, for the esolangers: The Less Humble Programmer http://digitalhumanities.org/dhq/vol/17/2/000698/000698.html

    xpointer · 4 hours ago

    <@gobblik> More specifically, The Humble Programmer is about "professionalizing" programming. In the '50s and '60s, programmers justified clever tricks by the strict constraints of early machines. Dijkstra is saying enough already with that: we need to move to a neutral style and favor clarity above all else, so programmers can understand others' work. Esolangs, which often annihilate readability, give an excuse to show off technical feats that aren't justified in mainstream code, a return to the "Wild West" (as Backus put it) of early computing.

  • jsonchao · 15 hours ago

    This is what I thought~

  • dkarl · 8 hours ago

    > But if you take as “performance” the duty cycle of the machine’s various components, little will prevent you from ending up with a design in which the major part of your performance goal is reached by internal housekeeping activities of doubtful necessity

    JITs have taken this to an even higher level — people don't just argue that the machine is fast enough to run their convoluted code with countless unnecessary layers, they argue that their code as they've written it won't be run at all: the JIT will reduce it to a simpler form that can be handled efficiently.

    But they can't explain why their poor coworkers who have to read and maintain the code don't deserve the same consideration as the machine!

    nradov · 8 hours ago

    <@dkarl> I don't understand your comment. A good JIT compiler can often make a program more efficient by taking advantage of runtime profiling. This allows developers to write simpler, more maintainable code without doing tricky things for efficiency.

    dkarl · 8 hours ago

    <@nradov> That's the upside of JITs and a great way to take advantage of them. Unfortunately, not every programmer is motivated to produce simple code. Some programmers prefer to write more complex code, either because they enjoy building castles in their mind, or because they would rather not take the time to remove any of the dead ends and missteps they made while searching for a solution.

    Highly optimized code being convoluted is an extreme case, for rare algorithms or exotic levels of instruction-level efficiency. The first 95% of optimization is simplifying the code, which benefits both the machine and the programmers.

    nradov · 6 hours ago

    <@dkarl> We usually catch that type of problem in code review.
