
For NVIDIA,

1. play around with the NVPTX LLVM backend and/or try compiling CUDA with Clang,

2. get familiar with the PTX ISA,

3. play around with ptxas + nvdisasm.
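
A concrete round trip, as a sketch (assumes clang built with CUDA support and the CUDA toolkit on PATH; add.cu is a hypothetical file containing a trivial kernel, and sm_80 is just an example target):

    # emit PTX for the device side only
    clang++ -x cuda --cuda-device-only --cuda-gpu-arch=sm_80 -S add.cu -o add.ptx

    # assemble the PTX to machine code (SASS) with ptxas, then disassemble it
    ptxas -arch=sm_80 add.ptx -o add.cubin
    nvdisasm add.cubin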


> But the lack of ECC was a huge bummer at the time of purchasing my system.

Why?..


Ever had to troubleshoot bit flips on a non-ECC system? One friend felt like he was going crazy as over the course of two months his system degraded from occasional random errors to random crashes, blue screens and finally to no POST. Another time, a coworker had to stare at raw bytestreams in Wireshark for hours to find a consistently flipped bit.


Don't overclock your memory.


All of these were with stock, non-XMP clocks.


Well then… test your memory :)


How often do you test your memory? The nice thing about ECC is it's always testing your memory, and (if it's set up properly!) you'll get notified when it begins to fail. Without ECC, your memory may begin to fail, and you'll have to deal with the consequences between when it starts to fail and when you detect it.
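
On Linux, for instance, the kernel's EDAC subsystem exposes running error counts in sysfs, so you can alert on the first corrected error instead of discovering corruption later (a sketch; paths vary by memory controller, and the EDAC driver must be loaded):

    # errors ECC corrected (ce) and errors it could only detect (ue)
    grep . /sys/devices/system/edac/mc/mc*/ce_count \
           /sys/devices/system/edac/mc/mc*/ue_count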

(Of course, I don't run ECC on my personal systems, but at least I'm wandering knowingly into the abyss)


Testing your memory detects if you have bad RAM, which ECC isn't going to help with anyway. Perfectly fine memory will experience random bit flips from environmental factors. Your PC components and UPS also degrade over time and can cause random bit flips. ECC is there to catch problems as they happen and ideally take corrective action before bad data propagates.


> Ever had to troubleshoot bit flips on a non-ECC system?

No.

> One friend felt like he was going crazy

Tell him about memtest86.


Wow, I came back to post this exact reply. I set my memory to a slightly higher frequency, ran memtest overnight, and got errors.

Set it back down to a supported frequency and ran the full memtest suite again with no errors.

Never had any issues since.


> Wow, I came back to post this exact reply. I set my memory to a slightly higher frequency, ran memtest overnight, and got errors.

> Set it back down to a supported frequency and ran the full memtest suite again with no errors.

Cool. You tested your memory at some point in the past.

How do you know it's still working properly and hasn't flipped any bits?

You don't. Because you have no practical way of testing the integrity of the data without running an intrusive tool like memtest86 that basically monopolizes the use of the computer.

Being able to detect these types of memory errors at a hardware level while the processor is doing other things is the fundamental capability that ECC gives you that you otherwise wouldn't have, no matter how thoroughly you run memtest86.


You likely wouldn't know if you had random bit flips. It'd manifest as silent data corruption. You might be okay with that. Others aren't.

It's not a matter of overclocking. Bit flips are a fact of life running with 32+ GB RAM. Leaving your machine on 24/7 (even if in sleep) stacks the odds against you.


Obviously this is just anecdote, but I have a work laptop with 128GB of non-ECC RAM, use all of it every day, and have never noticed any issues. I'm not saying there aren't any, but it just... works.


You have silent bit flips; they silently corrupt data instead of causing a visible error.


> extremely opinionated

I have not seen a single codebase that widely uses uint8_t and does not typedef it to u8. It is the exact opposite of "extremely opinionated".
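
That is, nearly every such codebase carries a prelude along these lines (a sketch of the shorthand in question):

    #include <stdint.h>

    /* the near-universal abbreviations for the fixed-width types */
    typedef uint8_t  u8;
    typedef uint16_t u16;
    typedef uint32_t u32;
    typedef uint64_t u64;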


Note Sam Altman is a Cerebras investor.


The first valuable comment imho, thanks!

Let me add: there are numerous other AI chip startups.


From the last thread on this (https://news.ycombinator.com/item?id=32610780):

- https://sambanova.ai/ (Enterprise AI and dataflow-as-a-service for established models)

- https://www.cerebras.net/ (AI accelerator, trying to compete with Nvidia)

- https://www.graphcore.ai/ (Another AI accelerator company, UK based)

- https://femtosense.ai/ (Sparse NNs on very low power chips, cool hardware and software challenges)

- https://sima.ai/ (ML accelerators for embedded applications)

- https://ambiq.com/ (Not AI, but low power chips for wireless using some fancy tech that reduces energy leakage)

- https://www.esperanto.ai/ (RISC-V based tensor compute chip, founded by former Intel Hybrid Parallel Computing Vice President Dave Ditzel)

- https://www.furiosa.ai/ (AI accelerator company showing good results in MLPerf benchmarks)

- https://groq.com/ (From the team that built the original TPU at Google)

- https://lightmatter.co/ (Light tubes instead of copper)

- https://www.untether.ai/


Exactly. We are now starting to slowly realize... [0]

[0] https://news.ycombinator.com/item?id=35490837


Ah, so that is how Sam and Co. cash out on the 10 billion from MS!


What happened?


Nothing happened. It was an informal technical interview with the JAX program manager, a one-hour remote call, but describing him as opinionated and entitled is an understatement. Best of luck to them.


Asking because I am on that team :)

I went through the same hiring process and had a positive experience at every stage. I had a strong competing offer but went with the JAX team at NVIDIA.

I'll pass it along as feedback.


Would you mind sharing some details? It sounds like an interesting peek behind the curtain.


1 is the multiplicative identity

0 is the additive identity

all([]) is True

any([]) is False


Go to the blog and skip to results: https://ai.meta.com/blog/seamless-m4t/


For these tasks and languages, SeamlessM4T achieves state-of-the-art results for nearly 100 languages and multitask support across automatic speech recognition, speech-to-text, speech-to-speech, text-to-speech, and text-to-text translation—all in a single model. We also significantly improve performance for low and mid-resource languages supported and maintain strong performance on high-resource languages.

To more accurately evaluate the system without depending on text-based metrics, we extended our text-less metric into BLASER 2.0, which now enables evaluation across speech and text units with similar accuracy compared to its predecessor. When tested for robustness, our system performs better against background noises and speaker variations in speech-to-text tasks (average improvements of 37% and 48%, respectively) compared to the current state-of-the-art model.

SeamlessM4T also outperforms previous state-of-the-art competitors.


The project is a port of https://github.com/nviennot/core-to-core-latency from Rust to C.


I used to surf near Scripps Pier while in college and I remember there were always a couple people sitting outside in beach chairs working on their laptops.

Salk Institute is an even more surreal place.


The funny thing is Perl is now arguably more obsolete than sed and awk.


My livelihood depends on a large set of Perl scripts, and I bet I'm not the only one in that position. It's perfect for gluing things together and then not breaking for decades, while other languages come and go. It's kind of the new COBOL that way.


I’d love to eventually settle into retirement by maintaining some unloved Perl.


Recently I wanted to see if some code I wrote still worked on Java 1.x.

The only way I could get Java 1.x to run (among my fleet of existing machines) was by installing the Windows JDK version and running it inside Wine on my Ubuntu box.

Native Windows 10 could not run Java 1.x. This surprised me, since in my previous experience Windows was pretty amazing at maintaining backward compatibility.

(I guess I could grab a super-old Debian docker image like Debian 6 and see if the linux Java 1.x binary would run on that, but I wanted to stick with machines I already had running.)


I make money off "old" systems; I'm spending time with Perl too. Lots of legacy systems out there.


Which languages are most common in these systems?


Well, I'm not sure about everywhere, but the work I'm finding is ASP 3.0 (IIS 6.0) and VB6 (more rare), plus Perl and PHP. Many of these projects were started in the late 90s. They are internal systems (billing automation, internal work-tracking, custom "ERP"). Nothing exciting, just old shit that makes money in a non-tech business. I find the work on LI, and I show up when folks search for these old skills. But I know a few others in this "maintenance" space. One just does legacy Java, one just does old Windows (NT4, 2000), etc. Generally we find work from each other.

Additionally, I think it's hard to enter the "legacy" space. You have to un-learn some patterns; lots of "oh yeah, we used to do this the hard way." The other bump is the documentation: the 1999 docs are buried under the 2009 docs, which are decaying under the 2019 docs. The thing called ActiveRecord, for example, means like 99 things.


Do you find any of these legacy-system clients are open to having their system upgraded to a modern version of the same or a related language, e.g. .NET for ASP? After all, it wouldn't involve a significant change in the business logic, and most established languages are more or less backwards compatible.


I hear that. I want that. But mostly what I see is that at least once before, a vendor shit the bed on a rewrite in $ModernX, and then the business play became "nurse it".

And, having been through many rewrites myself, I'm frequently surprised they could hose the deal. It feels like scope creep is the killer, but that's only a feeling; I have no metrics.


join(" ", @{$ounds->{really}{&fun}}) =~ /^[a]n+d\sgr8 4 (you(r|se)) eye$/g


Modern day ronin right here.


That might be true culturally, but it's not true technically. Once you know a bit of Perl (including some command-line switches like -e, -n, -a, -l, -p, and -i) you'll be able to do everything you used to do in awk and sed just as easily, but with a more powerful foundation so you have the flexibility to go further.
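
For instance (sketches; file.txt is a stand-in):

    # sed 's/foo/bar/g' file.txt     (-p loops over lines and prints)
    perl -pe 's/foo/bar/g' file.txt

    # awk '{ print $2 }' file.txt    (-n loops, -a autosplits into @F, -l handles newlines)
    perl -lane 'print $F[1]' file.txt

    # sed-style in-place edit, keeping a .bak backup
    perl -pi.bak -e 's/foo/bar/g' file.txt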


> but with a more powerful foundation so you have the flexibility to go further.

The question is: should you?

I like Perl, but it has many warts of its own that make maintaining a large codebase more of a nightmare than using C, and certainly more than most modern languages.

I agree with other commenters here that the tools Perl sought to replace are still used more than it. It has its niche of being excellent at text processing, and more capable than shell scripts at that task, but I'd think twice about reaching for it to build anything more complex than a shell script replacement. Especially in 2023.


What's a good resource to be proficient in perl for shell scripting?

I write a lot of bash for better or worse. I wonder if perl would be a better choice sometimes.


If you can't manage it with shell scripting, you should take a look at Python.

I've written Perl for many years but in the end switched to Python, because the resources you can find online are much better, in both quantity and quality, than those for Perl. Especially now that Google doesn't seem to return older results anymore.


I'd rather write perl tho


Worse Is Better syndrome.


"Minimal Perl" by Tim Maher is unrivaled for this.


Thank you for mentioning that book - I'm very interested in how perl can replace awk and sed and even grep(!) - I use grep multiple times a day, awk maybe a few times a week, and sed very rarely. I think I'm in the target demographic for this :)


The book contains Perl one-liner-style equivalents of these UNIX utilities which you can just drop into your bin directory.
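
In the same spirit, grep-alikes are one-liners too (my sketches, not the book's versions):

    # grep 'pattern' file.txt, but with full Perl regexes
    perl -ne 'print if /pattern/' file.txt

    # grep -in: case-insensitive, with line numbers ($. is the line counter)
    perl -ne 'print "$.: $_" if /pattern/i' file.txt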


Cool, I'm in the market for that.


For Perl-style grep there's also ack and its successor ag, which you might want to look at first.


I use Perl all of the time. If I'm going to write a bash script that's more than 12 lines long, I use Perl instead, and if I need to read text files and do regex matching, I haven't found anything better.


It's still used heavily for internal processes and internal web sites that query databases etc. At least at some of the large publicly traded companies I know about...


My first thought when I saw this was that I'd like to see some stats on use of the three. I personally use sed and awk regularly and never perl, but I have no idea what is typical. I have a feeling that most people (amongst those who may have a use case) don't use any of them.


Ditto - I use sed/awk daily, but I never write Perl any more (though I do use older bash scripts that call Perl under the hood).

I'm still comfortable recommending people learn sed/awk but I don't think I'd recommend learning Perl* now. It sits in an awkward spot between the simplicity of GNU utils and the expressivity of a scripting language, but doesn't do either of those things better than the equivalent sed/Python etc.

FWIW I still write Bash scripts on a daily basis, too.

* that's no comment on Raku, which I consider a separate beast.


> simplicity of GNU utils and the expressivity of a scripting language, but doesn't do either of those things better than the equivalent sed/Python etc.

It does scripting and gluing scripts far better than python.

It's not even close.

I'm not even going to address sed, awk, bash and the like.


Just imagine writing bash incorporating a sed script for something you can do with 20 keystrokes, then migrating the sed part to awk when it reaches 20 lines, then rewriting the whole thing in Python or somehow bridging it to the shell script, when you could have done the whole thing in Perl without feeling that it holds you back...
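
E.g. the classic grep-count-report pipeline stays one tool as it grows (a sketch; access.log is a stand-in):

    #!/usr/bin/perl
    use strict; use warnings;

    # hits per client IP: the "grep", "awk", and "sort | uniq -c" stages
    # of the pipeline in one place, with room to keep growing
    my %hits;
    while (<>) {
        $hits{$1}++ if /^(\S+)/;
    }
    print "$hits{$_}\t$_\n"
        for sort { $hits{$b} <=> $hits{$a} } keys %hits;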


> without feeling that it holds you back

This. I always feel like bash and pipes are holding me back and resisting.

Python is just too verbose to exec/pipe/read/write.

The real issue is that it's really easy to shoot yourself in the foot with Perl.


If you have a script that makes heavy use of sed, awk, tr and the like, then translating it to Perl would probably be most natural.

But otherwise a script either just calls a lot of commands one after the other, for which a shell script is fine, or it does a lot of data mangling with few or no external calls, in which case plenty of languages besides Perl are just as suitable.


Yep, as I said in another comment, I use sed and awk almost daily but never learned Perl. It's actually something that could be useful in my current role, though -- is there a particularly good intro to Perl in 2023 that I might want to look at?


The Perl man pages are really good! Start with "man perlintro" and then look at whatever sections in the "man perl" table of contents fit your needs.

A lot of the tutorials online in my opinion are overbaked and written for people writing large OO projects using a lot of scaffolding. But the man pages were originally targeted at sed and awk users in your position.


Sounds perfect, thanks!


The wonderful thing about Perl is that the docs from 2013 are still valid. All the growth/rapid-change/package-management issues that newer ecosystems suffer through, Perl went through in, like, the late 90s. 20-year-old code still works (mostly). Nowhere near the churn (anymore) of JS or PHP or Go or Rust.


Google Trends has sed crossing over perl about 5 years ago. Awk seems insignificant, which matches my expectation. (I've used sed but never awk; I don't even know what it does.)

Worth noting “sed” is “thirst” in Spanish, which has the potential to throw off the data, especially worldwide.

https://trends.google.com/trends/explore?date=all&geo=US&q=p...


awk is a fantastic command line application that is useful for filtering, basic computations and transformations that other basic utilities (e.g. cut, sort, etc.) + grep don't (readily/easily) provide. I suppose it might be a bit archaic these days, but for someone familiar with it, it's often faster to compose an awk command than to write even a very quick/short script in, e.g., python or perl, for certain things.
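
For example, a filter and a running computation in one breath (a sketch; data.log is a stand-in):

    # count the ERROR lines and sum their 3rd field
    awk '$1 == "ERROR" { sum += $3; n++ } END { print n, "errors, total", sum }' data.log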


>awk is a fantastic command line application that is useful for filtering, basic computations and transformations that other basic utilities (e.g. cut, sort, etc.) + grep don't (readily/easily) provide.

All true.

But awk is also a programming language, not just a command line application or utility.

It has conditionals, loops, regexes, file handling, string handling, (limited) user-definable functions, hashes (associative arrays), reporting abilities and more.

In fact, the name of the original book about awk is The AWK Programming Language.
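
A small self-contained illustration of those features (a sketch; run as awk -f wordfreq.awk somefile.txt):

    # word frequencies: a user-defined function, an associative array,
    # loops, and an END block
    function clean(w) { gsub(/[^A-Za-z]/, "", w); return tolower(w) }
    { for (i = 1; i <= NF; i++) count[clean($i)]++ }
    END { for (w in count) if (w != "") print count[w], w }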


> In fact, the name of the original book about awk is The AWK Programming Language.

Great book. I read it after it was so enthusiastically endorsed here on HN. A lot of people said it was worth reading just to take in the excellent technical writing style of Brian Kernighan.

I would offer a strong second to that.


Cool. Yes, his style is good in his other books too, like K&R (C), K&P (Unix), GOPL (Go), and The Practice of Programming.


It's not about stats. It's simply that the space between simple shell scripts and complex systems requiring a "big" language, which Perl used to fill, has shrunk a lot, because it is now filled with many other languages better suited to larger systems than Perl is. For most uses of Perl today there are a dozen other suitable languages, including some that are older than Perl but were not freely available at the time.

Stats-wise I am sure there are still a lot of Perl users. It has its fans, and there are a lot of existing systems, but it is now one choice among many, and not usually the best choice.


>for most uses of perl today there are a dozen other suitable languages, including some that are older than perl but were not freely available at the time.

Can you name some? I'm guessing Rexx is one of those you mean.


I am talking about the popular classics that have also been mentioned in other comments here: Python, Ruby, PHP, and several others that came after them.

By the older ones I mean languages like Lisp and Smalltalk, which didn't become available for free, or even usable on PCs, until the 90s.

I am not saying that they are all replacing Perl, but that Perl was used in areas where they are better suited; with their appearance, Perl is no longer needed in those areas, hence its usage space shrunk a lot.


Unix was always about tools doing one thing. Obviously awk and sed didn't do only one thing, but Perl broke away from that entirely: it tried to replace awk, sed, and the shell all at once.

The problem with Perl is that it was still too tied to Unix culture to compete with Python and its strong library set, so it was pretty much phased out as an awkward intermediary.


Doesn't booking.com still run on it?


Yes. Actually a lot of companies still run on it. Booking.com, Fastly, ByteMark, OpenCage, I even know of a few startups that run on Perl.


And cPanel (used by many web hosts)?


That's a name I haven't heard in years.


Yes, though not exclusively anymore.


There's a difference between using something because you want to and using something because you have to. awk and sed are sometimes the right tool, just not always.

When I used sed and awk a lot I also made heavy use of bash which makes things a lot nicer. And Python, of course.


I use Perl scripts dozens of times a day but I rarely write new code in it. awk and sed I use less often but it’s almost always ad hoc new scripts.

