My mistake, I was thinking of the wider ecosystem, not the runtime: formatters, bundlers, and linters like Biome, oxc, etc. being written in Rust or other compiled languages. That's where I saw the biggest speedup, because their developers decided to write them in a compiled language instead of JS on a JS runtime, where you'll inherently be limited even with a JIT.
Personally, the lack of a decimal type makes SQLite a no-go for me. It's too much of a hassle to do financial operations entirely on the application side.
Do you only build stuff that performs financial operations? I'm not sure why sqlite being suboptimal for this one very specific use-case means that sqlite as a whole is a no-go for you.
No, not at all actually. But I had a lot of applications where at some point we needed to add storage of currency in the database. It became relevant for billing or other features, without the app itself being financial in nature.
This and DATETIME. The workarounds for these are mostly fine if you’re just using SQLite as a key value store, but doing translations in SQL queries for these workarounds sounds atrocious.
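For what it's worth, the standard workarounds are storing money as integer minor units and timestamps as ISO-8601 text. A minimal sketch with Python's sqlite3 (illustrative schema, not from any particular app):

```python
import sqlite3
from datetime import datetime, timezone
from decimal import Decimal

con = sqlite3.connect(":memory:")
# SQLite has no DECIMAL or DATETIME column types, so store cents as
# INTEGER and timestamps as ISO-8601 TEXT.
con.execute("CREATE TABLE invoice (amount_cents INTEGER, created_at TEXT)")

amount = Decimal("19.99")
con.execute(
    "INSERT INTO invoice VALUES (?, ?)",
    (int(amount * 100), datetime.now(timezone.utc).isoformat()),
)

cents, ts = con.execute("SELECT amount_cents, created_at FROM invoice").fetchone()
print(Decimal(cents) / 100)        # 19.99, exact
print(datetime.fromisoformat(ts))  # round-trips, timezone intact
```

This keeps arithmetic exact and sorting correct, but as noted above, every query touching these columns has to know about the encoding.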
90% of people aren't using "SQL" anyway. They are just doing basic CRUD operations on a data store with SQL syntax, usually abstracted through an ORM. The only reason they want a SQL database in the first place is for ACID.
If you find yourself caring about data types or actually writing a query, you should probably set up an actual database server.
Yeah, Linux definitely has corporate sponsors. This is not a good rule of thumb.
React is also now owned by the React Foundation, so I also don't see why it would be problematic to contribute to it now that it doesn't (seem to) belong to Facebook anymore.
I think the anti-corporate angle is a bit extreme as it would rule out a large number of projects that are widely used.
If the project is truly open source and widely used by the community it shouldn’t matter if it is or was associated with a corporation. Contributing to it helps the public who use that project too.
To a degree. But the corporate interest is spread across enough organisations that it's much harder for the Linux kernel to reject a patch solely because it's good for business, whereas a lot of corporate open source projects - even those with an OSI approved license - will actively refuse to merge code that competes with their commercial offering or simply isn't submitted by a customer. Hashicorp already operated like this long before they switched to BSL. Unfortunately having a project owned by a foundation isn't a good indicator either, because I know of at least one Apache project where the entire membership is one company, the CEO is the project chair and code is sometimes just dropped into repos in one huge commit.
Why is EF regarded as such a good ORM? I've encountered countless bugs in different repos related to its stateful nature after many years in .NET. Personally I found it completely illogical for my ORM to maintain state. I just want it to hold my schema and build queries.
Are you referring to the change tracker? FYI you can make no-tracking the default (via `QueryTrackingBehavior.NoTracking`) or skip it per query with `.AsNoTracking()`; when you actually want to make changes, you'd better opt back in with `.AsTracking()`.
Anyway, I've used EF at work for about a decade and I'm happy with it. I surely have blind spots since I haven't used other ORMs in that time, but some things I like are:
- Convenient definition of schema.
- Nice handling of migrations.
- LINQ integration
- Decent and improving support for interceptors, type converters and other things to tailor it to our use cases.
What ORM do you prefer, and how does it differ by being stateless? What does saving look like, for example?
Dapper can be a better fit depending on the scenario. It's dumb objects. You fill them yourself with actual SQL statements. There is no change tracker. You are the change tracker.
The main issue with EF is that, ultimately, there is an expression translator that maps LINQ expressions to SQL. This mostly works, until it doesn't, or it does but generates strange SQL with poor performance. If all you are doing is CRUD or CRUD-adjacent work then it's fine. But for some complex stuff you spend a lot of time learning the innards of EF, logging generated statements, etc. That time is better spent writing good SQL, which something like Dapper allows.
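Dapper itself is C#, but the "you are the change tracker" style translates to any language: dumb objects, explicit SQL, nothing persisted until you say so. A rough Python sqlite3 analogue (hypothetical `User` table, not Dapper's actual API):

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class User:          # a dumb object: no proxy, no tracking, just data
    id: int
    name: str

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("INSERT INTO user VALUES (1, 'Ada')")

row = con.execute("SELECT id, name FROM user WHERE id = ?", (1,)).fetchone()
user = User(*row)

user.name = "Grace"  # mutating the object changes nothing in the database...
con.execute("UPDATE user SET name = ? WHERE id = ?", (user.name, user.id))
con.commit()         # ...until you write and commit the UPDATE yourself
```

The trade-off is exactly the one described above: full control over the SQL, at the cost of writing every statement by hand.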
Fair enough. We use Dapper for a handful of performance-critical queries. But I wouldn't want to use it for the 99% where EF works well. Just like I wouldn't want to hand-roll assembly more than where it's really needed.
And it's not just about performance. LINQ plays well with the same static analysis tools as the rest of C#. You know, type checking, refactoring & co.
EF hits you in the face right at the start with the massive convenience that it provides. And then the paper cuts start adding up, and adding up, and adding up.
Although the EF team has made huge progress towards keeping your entities persistence-unaware, it's still not enough and eventually you wind up building your project in Entity Framework just as much as in C#.
Being forced to compromise your domain model. Yes the product has improved this greatly in recent years but it’s still inadequate IMO.
Fluent syntax can at first seem like the product has achieved persistence ignorance nirvana but then you have to compromise a little here, compromise a little there, until some point, if you’re still thinking critically about the design, you realize that you’re writing your app in Entity Framework as much as you are writing it in C#, as I mentioned.
Passing around a large mutable blob (the DbContext) which, if not managed with the utmost discipline by your dev team, can make it necessary to understand what large swaths of the code do before you can adequately understand what any small part of the code does.
Yes, European roads are not as wide, since they make room for proper sidewalks and bike lanes. Another advantage is that narrower roads make drivers drive more carefully and slowly, reducing accidents even further.
In Japan many neighborhood roads (even in cities) are narrow and have no sidewalk to speak of. But I feel safe walking down them because drivers expect to go slow and look out for pedestrians and cyclists.
If you want to blow through an area fast, there are other roads for that with lighted crossings and sidewalks, and often slower mixed-use parallel roads for pulling in and out of businesses.
I've no doubt it varies, but they're all doing something differently that seems to work versus the US.
> Other countries haven’t seen this increase in pedestrian deaths: in every other high-income country, rates are flat or declining. Whatever’s causing the problem seems to be limited to the US.
Then we shouldn't really be talking about the US as a whole (its size and population stats are comparable to Europe's), but instead about individual cities and states. Denying that US states are correlated with each other, and that European city construction is likewise correlated, is to ignore the history of how they were built.
Not in my experience. The road widths were set hundreds of years ago and the buildings have not changed. Walking around European and UK towns I find myself much closer to cars than walking around in the US. This is a factor in keeping car speed low, which likely affects how frequent and severe pedestrian collisions are.
I do agree that European roads are safer, but not because they have made large segregated sidewalks (which is what I disagree with as reality and the root cause).
On small village roads with little traffic you don't even need pavements (not to mention bike paths) as long as the road is narrow and winding with good visibility. Cars drive slowly and rarely, it's perfectly fine to walk there.
Not the same, since if you buy Bitcoin you don't get partial ownership of the machines used to mine it, and those machines serve a single purpose which you cannot change.
I think in the end step by step functional programming will be widely adopted. Concepts that have a strong theory behind them tend to last longer. Similar to how relational databases and SQL are still the gold standard today.
> Concepts that have a strong theory behind them tend to last longer.
The programming paradigms (functional, imperative, declarative) as implemented generally don't have strong theories behind them. We have no sharp line to detect when a language sits in any of those categories for any major language apart from SQL. One can write something that is recognised as functional in an imperative language for example. And even the best example of purity, SQL, in practice needs to be hosted in an imperative or functional environment to get useful results. The declarative part of the language isn't competent at getting complex performance-friendly results.
One of the interesting things about being a member of the functional programming community is I genuinely can't tell what the claim to fame of the community actually is beyond a sort of club-like atmosphere where people don't like mutability. Are we claiming that C++ programmers don't know how to write pure functions? I hope not, they clearly know. Many are good at it.
I see this misconception a lot. Functional programming has nothing to do with mutability or immutability. Just like product types (or associative arrays) don't make for object oriented programming, the paradigm is about the patterns you use and the total structure of the program. In the case of functional programming, it's that your entire program is built around function composition and folds. This changes how you build your program on a fundamental level.
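To make "composition and folds" concrete, here's a tiny Python sketch; `compose`, `count_chars` and friends are made-up helpers, not from any library:

```python
from functools import reduce

def compose(*fs):
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: reduce(lambda acc, f: f(acc), reversed(fs), x)

# The program is a pipeline of small functions plus a fold, rather
# than a sequence of statements mutating shared state.
words = lambda s: s.split()
lengths = lambda ws: [len(w) for w in ws]
total = lambda ns: reduce(lambda a, b: a + b, ns, 0)

count_chars = compose(total, lengths, words)
print(count_chars("functional programming is composition"))  # 34
```

The point isn't that any one line is exotic; it's that the whole program is assembled by composing functions rather than by sequencing mutations.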
Additionally, functional programming does have a strong theory behind it: that of the lambda calculus. Everything is formalization and constructions built on top of that. We don't have a "hard line" to detect if a language is "functional" because it's not a language paradigm, but a property of the code itself. Grammar is orthogonal to this stuff. You can, however, make a pretty good guess by looking at the standard library of a language. There is no ambiguity that Idris or Haskell are languages designed for functional programming, for example.
I think this stems from a problem: there are a lot of programmers who, through no fault of their own, have engaged with the tools of functional programming but skipped building things with the alien and weird stuff, so in the end there's no actual learning of functional programming itself. Every new programming paradigm is a shift of your core perspective; each requires you to relearn programming from the ground up. It shouldn't feel like "this is just ABC with XYZ": you are only truly beginning to learn when you are put back in the shoes of a middle schooler unlocking the power of nested conditionals. If you don't get that feeling, you're just learning a different language, not a programming paradigm.
> And even the best example of purity, SQL
Purity is not a claim of functional programming. Also SQL is not anywhere close to being a functional language, it's not only a non-programming language, but even if it were (and it can be with extensions) its paradigm is far more in line with logic programming.
> This changes how you build your program on a fundamental level.
I think this manifestation of cargo cult mentality should be put to rest. For each paradigm we heard the same story, but we still see programs in ${LANGUAGE_A} not even following idiomatic styles and instead looking like programs transcoded from ${LANGUAGE_B}. You do not change the way you build programs by rewriting projects in a specific language or even style. You just get the same people implementing the same programs according to the same mental model. The key factor is experience and familiarity, and you don't build any of that by marketing a particular programming style, much less by which framework or library a project happens to adopt. The hard truth is that experience and familiarity derive from the problems a developer faced and the solutions that worked, and quite bluntly, after all these decades functional programming isn't living up to the hype created by its fans. It's the curse of Lisp all over again.
I'm not sure that your objection puts anything to rest, least of all a "cargo cult mentality". You re-assert exactly what I say just to disagree and miss the rest of the post. Codebases which are not written in a functional style are not functional programming, no matter the language being used. The overwhelming majority of programmers did not learn to program in a functional style, and without the long haul learning process of thinking about program construction in terms of function composition and application, they will program according to whatever they've always used. They can't program or use a mental framework that they don't know. Languages are languages. The syntax and grammar of a language has (virtually) no effect on a programming paradigm. There's nothing in the grammar stopping you from using Haskell as an imperative language. That's a knock-on effect of being a turing complete grammar, no more, no less.
Given these facts which are absolutely not up for debate, some statistical sample of codebases which are just some unstructured hodge-podge mudball in any given language isn't particularly meaningful. Somebody who spent 2 weeks reading Haskell tutorials in their off-time isn't going to be writing functional programs in Haskell. Hell, somebody who has been writing mudball Haskell programs for years isn't going to be writing functional programs.
> and quite bluntly after all these decades functional programming isn't living up to the hype created by its fans
Perhaps your problem is that you are looking through such a lens that "hype" even registers on your radar. And that's why I'm sure you can't name a single weak point of functional programming off the top of your head besides the ever-popular, ever-erroneous complaint about "performance" by people who think functional programming is immutability, purity and function pointers. If all you care about is the social baggage around a working discipline and its tools, how can you possibly have anything but the most trivial, surface-level and, to put it bluntly, ignorant view of things? There is no substance to that nonsense, don't kid yourself. Until you change that you will never receive any signal, just noise. The heterogeneity of programming as a discipline will continue to frustrate you to no end.
"All over again"
Lisp is a tool explicitly designed for functional programming ;-)
To chime in to this discussion, I suppose there is a disagreement of what functional programming means exactly. For some it means Haskell or equivalent, for you it seems to be much laxer. Functional-programming evangelists tend to be on the stricter side, though.
When programming with “functional core with imperative shell”, is that functional programming overall?
In the projects I work on, functional-style parts are commonly bookended by imperative parts both outwards and inwards. The inward parts are things like database calls, file I/O, OS state, or other remote service calls. The art lies in keeping as much code free from side effects as possible, but it's rarely only the "shell" that is imperative.
Regarding functional (de)composition and proper structuring, that was very much a thing in imperative programming from fairly early on. That’s not specific to a functional style.
I suspect that it depends on the context in which people first learn about it whether they associate it with functional programming or not.
And I think that a lot of the aversion to functional programming advocacy comes from advocates demonizing anything imperative, while in reality a mix of both is appropriate.
> Functional-programming evangelists tend to be on the stricter side, though.
> I think that a lot of the aversion to functional programming advocacy comes from advocates demonizing anything imperative
I find these types to often be more motivated by things like nicely designed grammars and powerful type systems more than functional programming itself. A convenient test is give someone over to a J codebase. If they principally learned in a functional environment, or otherwise mastered functional programming, they'll love it. If they just like having language features that aren't 50 years old, they'll want to tear their hair out.
I have seen academic-types that abhor the idea of a flip-flop. I'm not programming a blackboard, so I pay this no mind.
> When programming with “functional core with imperative shell”, is that functional programming overall?
The functional parts surely are. The imperative parts are probably not (since they have to be very granular). Why reduce it to a binary overall? A complex codebase is a highly entropic thing. How much does the functional core influence the non-functional shell? It probably varies, and would require a high dimensional manifold to accurately graph.
> Regarding functional (de)composition and proper structuring, that was very much a thing in imperative programming from fairly early on.
Imperative and functional aren't generally mutually exclusive in my eyes. You can use statements to structure programs in a functional manner. No problem.
Threaded code is relatively similar to functional programming, but it's also very dead unfortunately. Most people's introduction and exposure to it is exclusively through Forth toys. It was (and maybe still is?) a real assembly technique. You need better control over the callstack than you get out of most imperative languages these days.
Point-free style is a critical component of functional programming in my eyes. Far more important than purity. It's not exclusive to functional programming, but it's pretty much holding up half the sky. Without it, you get crap like recursively descending a list of function pointers and void* args. Not ergonomic.
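As a rough illustration of pointed versus point-free style in Python (Haskell reads far better for this; `compose` is a hypothetical helper, and all names are made up):

```python
from functools import partial

def compose(f, g):
    return lambda x: f(g(x))

# Pointed style: the argument xs is named and threaded through.
def total_pointed(xs):
    return sum(abs(x) for x in xs)

# Point-free style: defined purely by combining functions; no
# argument is ever named.
total_pointfree = compose(sum, partial(map, abs))

print(total_pointed([-1, 2, -3]))    # 6
print(total_pointfree([-1, 2, -3]))  # 6
```

Same behavior, but the point-free definition is built from existing functions rather than from plumbing a named variable through an expression.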
I should also note that I would consider ENIAC's programming paradigm to be rather close to functional:
> I'm not sure that your objection puts anything to rest, least of all a "cargo cult mentality". You re-assert exactly what I say just to disagree and miss the rest of the post.
Not quite. I'm pointing out how your argument is based on a cargo cult mentality where throwing "functional programming" labels at practices is all that's needed to fix any and all problems. It is not. Functional programming has been a thing for decades and you still get software with the same rate of bugs and failures as your average codebase.
> Codebases which are not written in a functional style are not functional programming, no matter the language being used.
This is another predictable way the cargo cult mentality copes with failing to live up to its extraordinary claims. The bug-free cargo planes aren't landing on the makeshift functional-programming airstrip, so you pin the blame on the cult of functional programming being practiced wrong.
The truth of the matter is that programming paradigms are not panaceas, and developers still need to be mindful of software engineering best practices to minimize the risk of introducing problems.
Lambdas are now a requirement for any modern programming language. Monadic promises are how asynchronous programming is done. Rust is basically OCaml except with typeclasses and without a GC.
Inch by inch, ideas from FP bubble up into mainstream programming.
I predict that algebraic effects (a la Koka) are next, but it'll be a while before the industry is ready for it.
To play the devil's advocate: From an OOP perspective, lambdas are just syntactic sugar for a special form of objects. Regarding monads, we had smart pointers and the like in C++ already in the 1990s. Both concepts are orthogonal to mutability.
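That sugar can be made literal: a closure is an object with one method and captured fields, and vice versa. A Python sketch with a hypothetical `Adder`:

```python
class Adder:
    """The 'object' form: captured state plus a single call method."""
    def __init__(self, n):
        self.n = n
    def __call__(self, x):
        return x + self.n

def make_adder(n):
    # The 'lambda' form: the closure captures n, just like self.n above.
    return lambda x: x + n

print(Adder(3)(4))       # 7
print(make_adder(3)(4))  # 7
```

Either form can mechanically stand in for the other, which is exactly the closure/object duality being described.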
I'm convinced that imperative programming will endure alongside functional code, because being stateful is how the real world works, and most programs have to repeatedly interact with the real world over the course of their execution.
Sure. I don't mean to say that imperative programming is going anywhere.
If you're looking for programming languages with no support for imperative programming, Excel is pretty much it. Even Haskell has robust support for sequencing actions. If it didn't, I don't think we'd be talking about it at all.
What I predict is that ideas from FP will continue to bubble up into the mainstream. Like prior inventions, they won't be presented as a thing that asks you to rework all of your algorithmic code. They will instead be polished and presented in a way that makes them work more as extensions to what you can already do.
If you squint a little bit, Haskell's do-notation is already available in mainstream languages in the form of async/await syntax. async/await is not quite as general as the original Haskell solution, but it also doesn't ask you to completely rethink the way you design algorithms.
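The resemblance is easy to show: each `await` sequences one action's result into the rest of the computation, much like a bind in `do`-notation. A minimal Python sketch with made-up actions:

```python
import asyncio

async def fetch_user():          # stands in for an effectful action
    return {"id": 1, "name": "Ada"}

async def fetch_orders(user_id): # another action, depending on the first
    return [f"order-{user_id}-a", f"order-{user_id}-b"]

async def main():
    # Each `await` plays the role of a monadic bind: run the action,
    # feed its result into the remainder of the computation.
    user = await fetch_user()
    orders = await fetch_orders(user["id"])
    return len(orders)

print(asyncio.run(main()))  # 2
```

The restriction compared to Haskell is that this notation is hardwired to one "monad" (the event loop's futures) instead of working for any type with the right structure.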
Over the last 3 or 4 decades, our procedural and OO languages have slowly been absorbing the best ideas from "FP languages." We're getting to the stage where the very idea of a "functional language" is eroding.
>> From an OOP perspective, lambdas are just syntactic sugar for a special form of objects.
Indeed Smalltalk - a pure OOP language - had `Block` objects fifty years ago.
>> I'm convinced that imperative programming will endure alongside functional code, because being stateful is how the real world works, and most programs have to repeatedly interact with the real world over the course of their execution.
That, and also who wants to wrestle with monad transformers to make a stateful computation work when they have to fix a bug in production ASAP?
> Indeed Smalltalk - a pure OOP language - had `Block` objects fifty years ago.
Although in the original Smalltalk-80, blocks were not full closures, so it didn't support all the things you would expect to be able to do with a lambda.
"The major commercial Smalltalk implementations of the 1990’s all corrected these problems and developed various techniques for efficiently implementing blocks."
The duality between closures and objects is well known. It's hard to avoid having some construct like that, but mainstream languages shifting toward the functional perspective on it is definitely still evidence for penetration of functional ideas.
Python will be the last man standing with basically no functional goodies.
People will keep on trucking with their "pythonic" for loops containing appends to a list that was initialized right before, their disgust for recursion, their crippled single-expression lambdas, the lack of monadic return types or containers.
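For the record, both styles are legal Python today; the complaint is about which one the culture calls "pythonic":

```python
# The 'pythonic' imperative version the parent describes: initialize
# a list, then append to it in a loop.
result = []
for n in range(10):
    if n % 2 == 0:
        result.append(n * n)

# The functional equivalents, both also valid Python:
squares_comp = [n * n for n in range(10) if n % 2 == 0]
squares_map = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, range(10))))

print(result == squares_comp == squares_map)  # True
```
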
That seems improbable. Pure functional languages are very unpopular according to TIOBE. In particular, interest in Haskell seems to be in decline. Functional features are mostly just an add-on to programming languages with mutable state. By far the most popular language, Python, doesn't even have static typing.
I know powerful typing features are very important for Haskell in particular, but is static typing considered a functional feature more broadly? Most lisps don't have static typing as far as I know. Clojure was one of the main functional languages that actually saw industry use for a while, and it is dynamically typed.
It seems to me that algebraic nominal types are getting strongly attached to functional programming right now. Even on multi-paradigm languages, both idioms tend to come together.
It's not a necessary relation, those things are pretty much independent. Except in that pattern matching and function composition complement each other very well.
"but is static typing considered a functional feature more broadly"
It might be, but it is not an essential feature. Functional programming is the practice of using pure functions as the primary form of computation and composition. Whether one adds types to this is as relevant as adding types to any other paradigm (imperative, oop, logical, functional, etc)
Right, that was my assumption. I asked because the person I replied to mentioned the popularity of dynamic languages as a data-point for the decline in popularity of functional programming.
That's already been happening for quite some time. Mainstream programming has done little else in recent years than converge toward functional programming.
Also, they wrote "functional," not "purely functional."
By the way, from a formal perspective, imperative programming is an extension of purely functional programming.
This is true, in that most of the scholarship builds up its proofs starting with the lambda calculus. But there are so many paradigms (Turing machines, SKI combinators, excel spreadsheets) that are equivalent that I’m not at all convinced they had to start with lambda calculus. They just happened to.
Out in the real world, the thing that all programming languages are actually built on top of looks much more like a Turing machine than a collection of composed anonymous functions. But of course if you want to make your programs go really fast, you can’t treat them like Turing machines either. You need to acknowledge that all of this theory goes out the window in the face of how important optimizing around memory access is.
Which isn’t to say one perspective is right and one is wrong. These perspective all exist and have spread because they can all be useful. But acting like one of them is “reality” isn’t all that helpful.
Ps. Not that the parent actually said the formal perspective was reality. I just wanted to articulate this thought I had bouncing around in my head for a while.
> Out in the real world, the thing that all programming languages are actually built on top of looks much more like a Turing machine than a collection of composed anonymous functions.
Hardware logic as described in a HDL language is precisely a collection of "composed anonymous functions", including higher-order functions which are encoded as "instructions" or "control signals". We even build stateful logic from the ground up by tying these functions together in a "knot", with the statefulness being the outcome of feedback.
But it's hard to argue the machine at the end is stateless. We can endlessly do this. You can construct lambda calculus with Turing machines and Turing machines in lambda calculus.
There seems to be this weird idea in the functional community that the existence of some construction of one thing in another shows that one of those things is "more fundamental" than the other, when in reality this is often a circular exercise. e.g. Functions can be formalized as sets and sets can be formalized as functions.
Even worse in this specific case, the Church-Turing thesis tells us that they're equivalent, which is the only sensible answer to the question of which is more fundamental. There's an oft-quoted phrase of "deep and abiding equivalencies" and it bears pointing out how deep and abiding these equivalencies are. From a formal perspective they are the same. Yes, there are arguments to be made that typed lambda calculus and its relation to logic are important, and that's true, but it's not a formal argument at all and I think it's best to be clear on that.
> You can construct lambda calculus with Turing machines and Turing machines in lambda calculus.
I realize that these models of computation are equivalent. My point was rather that the imperative paradigm collapses into the functional paradigm in practical programming when I disregard the admissibility of arbitrary side effects.
> e.g. Functions can be formalized as sets and sets can be formalized as functions
I can derive functions from set theory in my sleep, and I can kickstart set theory without functions, but I wouldn't know how to define the concept of a function without sets. And even if I did: I can't even specify the characteristic function of a set without resorting to the inclusion relation.
> But it's hard to argue the machine at the end is stateless.
I'm not really that interested in the relationship between the various paradigms and the machine. What interests me most is how well I, as a human being, can write non-trivial programs. To me, it is immediately obvious that the composition of purely functional program units is conceptually simple enough to be done by a child, while unrestricted side effects can very quickly make things very complicated. However, I don't want to get involved in the discussion on this topic. I have accepted that others see it differently, although I find that completely baffling. I don't want to take away anyone's for loops, etc. To each their own.
> I realize that these models of computation are equivalent. My point was rather that the imperative paradigm collapses into the functional paradigm in practical programming when I disregard the admissibility of arbitrary side effects.
But in practical programming with imperative languages, arbitrary side effects can't be disregarded, so they don't collapse into the functional paradigm. In fact, from a physical perspective, every possible CPU has states, so the most physically fundamental model of computation (something like register machines, or GOTO programs) is imperative and more fundamental than functional models, like untyped lambda calculus. Functional models might be more mathematically elegant though.
> I wouldn't know how to define the concept of a function without sets.
Whitehead and Russell showed how to define functions just in first-order logic with identity, without requiring any set theoretical axioms, by defining an n-ary function via an n+1 place relation. See here top right: https://mally.stanford.edu/Papers/rtt.pdf
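Concretely, the idea is that an n-ary function is just an (n+1)-place relation satisfying existence and uniqueness; for the unary case, sketched in first-order logic with identity:

```latex
% R is functional: every x relates to exactly one y.
\forall x \, \exists y \, \bigl( R(x, y) \wedge \forall z \, ( R(x, z) \rightarrow z = y ) \bigr)
% "f(x) = y" is then mere notation for R(x, y); no set of ordered
% pairs, and hence no set-theoretic axioms, are required.
```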
This is quite natural, because predicates (properties and relations) already occur in natural language, while sets do not; they are a mathematical abstraction. For example, sets can be empty, or arbitrarily nested, or both arbitrarily nested and otherwise empty, which has no analog in natural language.
> I can't even specify the characteristic function of a set without resorting to the inclusion relation.
If you try to define sets by using functions, functions are in this context assumed to be more fundamental than sets. Then you don't need to define functions. Then the inclusion relation is simply defined via the characteristic function. You don't need to define that function. Just like you, in the reverse case, don't need to define sets, if you want to define functions via sets.
> But in practical programming with imperative languages, arbitrary side effects can't be disregarded, so they don't collapse into the functional paradigm.
I'm sorry to have to say this so bluntly, but I think you understand as well as I do that in a language such as C#, it is entirely possible to write large amounts of purely functional yet useful code, just as you would in Haskell. That's why it's possible in SICP to wait until Chapter 3 to introduce the special form `set!`. That is the issue I was concerned with.
> from a physical perspective
I already mentioned that this is not the perspective that interests me. I don't care at all about the physical substrate for computation.
Thanks for the paper. I might take a look at it, although I've already been given a good tip elsewhere with map theory. I'm not convinced by the claim that properties and relations occur in natural language but sets supposedly do not.
The last paragraph isn't very helpful either. I'm not sure who is misunderstanding whom here, but we don't need to hash it out. This isn't a conversation I'm enjoying.
Thanks! It seems that in the metatheory, one can resort to type theory in order to avoid having to fall back on set theory in a circular manner. Unfortunately, I don't know anything about that, but I'll take a closer look at it.
Good point on functional vs purely functional. To the GP, what we're seeing is that more mainstream languages are taking the most popular features from functional languages and integrating them. The combination of Scala & Rust are a perfect example given how ML inspired they are. C# has a bevy of functional trappings.
Haskell is suitable for, and designed for, bleeding edge experiments, not for practical usage. Its low popularity says very little about the "market penetration" of better engineered functional languages.
There is no other popular functional language either, unless you count imperative languages that optionally let you write functional code, where in practice most code is still imperative.
> Haskell is suitable for, and designed for, bleeding edge experiments, not for practical usage
The notion of practicality depends on what one wishes to practice. You, for instance, are practicing FUD-spreading: it's practical for you to say these things without actually doing any work with the tools provided.
I tried to learn Haskell, and while the language has merits every library I attempted to use was a problem: at best not documented enough, but more often a half-baked proof of concept.
I know Robert Harper has advanced that argument, but I think it's only really interesting for striking a single rhetorical blow: that, uh, actually statically typed languages are trivially more expressive than Python, since Python has one static type and they have as many as you want.
But I think as an actual ontology, as an actual way of understanding and structuring the world, it's obviously quite silly. There's no compile-time (static) type checking phase in Python, because it'd be trivial and therefore a waste of time. Without static type checking, what does it mean to say it's "statically" typed? Moreover, the language has an entirely different, though related, concept of type, which is checked dynamically. Yes, you can say that "Python is a statically typed language with only a single, meaningless static type, but also dynamically checked dynamic types", but frankly that's not telling you anything that calling it a dynamically typed language isn't.
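The "checked dynamically" part is easy to demonstrate: in Python, a type error only surfaces when the offending expression actually executes, because there is no earlier checking phase.

```python
def add(a, b):
    return a + b

print(add(1, 2))  # 3

# The mismatched call below is not rejected ahead of time; the error
# appears only at runtime, when the addition is actually attempted.
try:
    add(1, "two")
except TypeError:
    print("caught a TypeError at runtime")
```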
From a different angle, you can't program SML or something "monotypically" - just restricting yourself to one type - without effectively building an interpreter for an entirely separate, dynamic language inside of it (you can of course build a statically typed language inside of Python, so if you think that counts you're just stripping the meaning from comparing languages). In that sense, Python's just plain doing something fundamentally different from what "a static language with one type" is.
These so-called dynamic types are merely the equivalent of tags in a discriminated union/variant type. Statically-typed languages can easily do the same thing: the argument that this amounts to "building an interpreter" applies to any language implementation.
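The "tags on a variant" view can be sketched directly (illustrative Python; the names are made up, but this mirrors how an interpreter's universal value type might look):

```python
from dataclasses import dataclass

@dataclass
class DynValue:
    tag: str          # e.g. "int", "str", ...
    payload: object   # the carried value

def dyn_add(a: DynValue, b: DynValue) -> DynValue:
    # The "dynamic type check" is just an inspection of the tags.
    if a.tag == "int" and b.tag == "int":
        return DynValue("int", a.payload + b.payload)
    raise TypeError(f"cannot add {a.tag} and {b.tag}")

print(dyn_add(DynValue("int", 1), DynValue("int", 2)).payload)  # 3
```

In a statically typed language the same shape would be a discriminated union with a `case`/`match` over the tag, which is the equivalence the comment is pointing at.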
> These so-called dynamic types are merely the equivalent of tags in a discriminated union/variant type.
That's far more true in a language like JavaScript or Scheme than in an "everything is an object" language like Python; the only reason why you would need a variant type for PyObject is to avoid the circular data structures the actual implementation uses.
If you allow the circular data structures, your dynamic types instead are "merely" a relatively complicated codata type, but it's far less obvious that this is actually what anyone considers to be "merely."
> Similar to how relational databases and SQL are still the gold standard today.
...except they aren't. The world is gradually slipping towards NoSQL databases and stepping away from normalized data stores, to the point where tried-and-true RDBMSs are shoehorning in support for storing and querying JSON documents.
All flagship database services from major cloud providers are NoSQL databases.
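The "RDBMSs bolting on JSON" point is easy to see in practice; for example, SQLite can store and query JSON documents via its JSON functions (this sketch assumes a SQLite build with the JSON1 functions, which are standard in modern builds):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (body TEXT)")
conn.execute("INSERT INTO docs VALUES (?)", ('{"user": "ada", "age": 36}',))

# Query inside the JSON document from plain SQL.
row = conn.execute(
    "SELECT json_extract(body, '$.user') FROM docs "
    "WHERE json_extract(body, '$.age') > 30"
).fetchone()
print(row[0])  # ada
```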
Perhaps you're confusing pervasiveness with being the gold standard. Yes, RDBMSs are pervasive, but so are non-compliance with the standard and vendor-specific constructs. The biggest factor was not the theoretical background behind each implementation but ease of use and ease of adoption. SQLite is an outstanding database not because of its compliance with SQL standards and adherence to theoretical purity, but because it's FLOSS and trivial to set up and use.
I expect the same factors to be relevant in the adoption of functional programming. This stuff has existed for decades on end, but mass adoption was somewhere between negligible and irrelevant. What started to drive adoption was the inception of modules and components that solved real-world problems and helped developers do what they wanted to do with less effort. In JavaScript, the use of map, filter and reduce is not driven by any new-found passion for functional programming but because it's right there and it's concise, simple, readable and capable. In C#, LINQ is gaining traction for the same reasons but is largely held back by performance issues.
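The map/filter/reduce idiom described above has a direct Python analogue (shown here instead of JavaScript to keep the thread's examples in one language):

```python
from functools import reduce

nums = [1, 2, 3, 4, 5]
total = reduce(lambda acc, n: acc + n,                    # fold the squares together
               map(lambda n: n * n,                       # transform each element
                   filter(lambda n: n % 2 == 1, nums)),   # keep only the odds
               0)
print(total)  # 1 + 9 + 25 = 35
```

The appeal is exactly the one stated: it is right there in the standard library, concise, and readable, with no commitment to a functional language.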
The key aspect is pragmatism; theoretical elegance is merely a nice-to-have.
> ...except they aren't. The world is gradually slipping towards nosql databases and stepping away from normalized data stores
The 2010s called and want their database paradigm back.
NoSQL is definitely not the zeitgeist today. Graph DBs, KV stores, document DBs, and OLTP DBs are what's popular today. I would say the two shifts I see in the 2020s are the rise of cache-as-database (e.g. Redis) and an “all of the above” mentality rather than one-size-fits-all.
> The world is gradually slipping towards nosql databases
My observation has been that this pendulum is swinging back the other direction. You're correct that some NoSQL-isms found their way into RDBMS, but at least in the circles in which I travel, people are by and large heading back to RDBMS stores.