Created.
Must go off to work now, so haven’t thought out stuff to start with, except:
Python > C/C++/C#, depending on context.
Linux >> Windows, depending on context.
Moved to Random Fun as the Debate room’s focus is a/theism.
Guess I was a bit too quick on the trigger there…
For a hobbyist, that’s perfectly reasonable. For a professional, it’s incumbent upon them to take care not to create situations that result in failure. Part of my job as a team leader, and later as a manager, was to ensure everyone on the team was qualified and took appropriate measures to make sure their code was technically correct and without security issues. We also performed code reviews to catch errors the original developer may have missed, and our test teams also added another layer of verification on top of that.
It’s my opinion that languages like Rust are band-aids for the real problem: programmer carelessness and incompetence. Many of the common issues (stack overflows, memory leaks) are so well known and understood that there’s no excuse for anyone to make them anymore.
For new development, Rust does make sense, as long as the team is adequately trained in its use. Making an experienced team use a language they just learned has its pitfalls too, as people tend to make more mistakes with tools they don’t have much experience with.
Taking a large existing codebase written in, for example, C, and recoding it in Rust is an exercise in futility. It’s almost guaranteed that new bugs will be introduced into the new codebase that were painstakingly found and fixed in the old codebase over the years. I’ve seen this happen time and time again during my time in the industry, and it was never pretty. The only reason to completely rewrite an existing codebase is if that codebase is hopelessly buggy and poorly organized. Doing it for nearly any other reason is a waste of time and money.
Agreed. That would even be a reason to rewrite it in the SAME language. The language isn’t really the issue, it’s the architecture / design / implementation discipline that’s at issue.
This is why I wonder whether Microsoft embarking on this program to rewrite much of Windows in Rust isn’t misguided. The rationale seems to be that Rust has magic fairy dust that prevents security vulnerabilities, but even if that’s true relative to existing C / C++ code, it has no magic fairy dust to prevent the reintroduction of old bugs, the existence of which is poorly remembered or understood after all these years.
Those are wise words, and you explain exactly why banking, finance, insurance, and other transaction-based use cases keep their old Cobol code that has been developed and debugged over decades. And why big numerical codes like weather forecasting codes or complex simulation packages developed over time still keep their Fortran cores. And why, e.g., standard linear algebra libraries like LAPACK still follow Fortran 77 syntax.
One can complain and be smug about how Cobol is obsolete (although Cobol does have some very redeeming features, e.g. complex filtering of tables, that put other languages to shame) and how Fortran is 1970s legacy (actually, modern versions of Fortran are remarkably agile and expressive when you’re doing multi-processor number crunching). The biggest complaint I have about them is really that they have a somewhat old-fashioned syntax. But for doing numerical stuff, I will much prefer the relatively clean (if old-fashioned) syntax of Fortran over, e.g., C++ with its complex, line-noise-like type syntax, especially when doing templates.
In my line of work, I write code for a very select niche audience. I do exploratory programming and prototyping, writing software that starts out as one-off data analysis programs and slowly evolves into more general analysis packages, and I sometimes need to do interactive work to figure out, e.g., how some totally non-standard and undocumented protocol or file format works (call it problem solving and detective work). For that kind of work, memory management is an unwanted distraction. So, for these reasons and for the past couple of decades, Python has been my language of choice, and it is a hell of a lot more useful for my purposes than e.g. C++ or similar languages.
I have always been attracted to Python but never had the opportunity to do any paying work in it.
I’m given to understand its weakness, historically, has been multi-threading and the Global Interpreter Lock but there’s a lot of change in the wind now around that.
Have you run into any practical performance issues where you had to drop into C or use a C library to work around it?
Yes, the GIL can be annoying and limiting if you need multi-threading. But there are techniques you can use to emulate or mitigate some of it.
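A minimal sketch of one such technique: the GIL only serializes threads within a single interpreter process, so CPU-bound work can be handed to worker processes with the standard-library `multiprocessing` module, each with its own interpreter and its own GIL. The prime-counting function and the chunk sizes here are just made-up stand-ins for real work:

```python
# Sketch: sidestep the GIL for CPU-bound work by using processes
# instead of threads. Each worker runs in its own interpreter with
# its own GIL, so the chunks genuinely run in parallel.
from multiprocessing import Pool

def count_primes(limit):
    """Deliberately naive CPU-bound work: count primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # Four chunks run concurrently, one per worker process.
        results = pool.map(count_primes, [10_000, 20_000, 30_000, 40_000])
    print(results)
```

This only pays off when the work per chunk outweighs the cost of spawning processes and pickling arguments back and forth; for I/O-bound work, plain threads (or asyncio) remain fine despite the GIL.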
Not really. Raw performance is really not what matters in the big picture; utility is. If things go too slow, it’s more an indication that I’m using the wrong approach or the wrong algorithm. And whenever I need to do numerics with relatively big matrices that I smash together, I use the standard Python package numpy (which is written in a combination of C and Python) for the heavy lifting. Although I’m not really doing that sort of work myself, numpy and pytorch are at the very core of artificial neural networks in the AI world. There, pytorch and the GPU do the real work, while Python is the glue that connects the high-performance pytorch blobs.
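The Python-as-glue pattern above can be sketched in a few lines: the actual number crunching happens inside numpy’s compiled core (BLAS for the matrix product), while Python just orchestrates. The array sizes here are arbitrary illustration:

```python
# Sketch: Python as glue, numpy as the engine. The heavy loops run
# in numpy's compiled C core, not in the Python interpreter.
import numpy as np

rng = np.random.default_rng(seed=0)
a = rng.standard_normal((500, 500))
b = rng.standard_normal((500, 500))

c = a @ b                              # matrix product, done by BLAS under the hood
row_norms = np.linalg.norm(c, axis=1)  # one norm per row, fully vectorized
print(row_norms.shape)                 # prints (500,)
```

The equivalent pure-Python triple loop would be orders of magnitude slower, which is exactly why "my code is slow" in numeric Python usually means "I wrote a Python loop where a vectorized call would do".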
The biggest downside with Python from an application developer’s perspective is probably that it is compiled into byte code and executed in a virtual machine, rather than compiled ahead of time to native code. Which means that once you have working code you can hand off, you cannot just easily compile it down into a distributable binary blob. Instead, you distribute it as a source code package.
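There is a standard-library halfway house worth knowing about: `zipapp` bundles a source tree into a single runnable `.pyz` archive. It’s still not a native binary (the target machine needs a Python interpreter), but it is a one-file artifact. A minimal sketch, with a made-up app directory:

```python
# Sketch: bundle a tiny app into one runnable .pyz file with the
# stdlib zipapp module. The archive still requires a Python
# interpreter on the target machine; it is not a native binary.
import pathlib
import tempfile
import zipapp

with tempfile.TemporaryDirectory() as tmp:
    src = pathlib.Path(tmp) / "myapp"          # hypothetical app directory
    src.mkdir()
    (src / "__main__.py").write_text('print("hello from the archive")\n')

    target = pathlib.Path(tmp) / "myapp.pyz"
    zipapp.create_archive(src, target)         # run it with: python myapp.pyz
    created = target.exists()
print(created)
```

For a true standalone binary, third-party freezers (PyInstaller and the like) exist, but they bundle an interpreter rather than compiling to native code.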
I’ve used Cython quite a bit in the past when I needed speed for lots of calculations that were just too heavy for standard Python. It might have been what started me down the path of favoring static typing.
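As a rough illustration of what that typing looks like, here is a sketch of a Cython `.pyx` function (the function itself is a made-up example): declaring C types for the locals lets the loop compile down to plain C arithmetic instead of Python object operations.

```cython
# Sketch of a Cython .pyx file: C-typed locals let the loop compile
# to plain C arithmetic instead of Python object operations.
def array_sum(double[:] xs):        # typed memoryview over a float array
    cdef double total = 0.0
    cdef Py_ssize_t i
    for i in range(xs.shape[0]):
        total += xs[i]
    return total
```

Compiled with `cythonize`, a loop like this typically runs at C speed; the same function without the `cdef` declarations would still work but would fall back to interpreter-speed object arithmetic.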
Question! If this board ever gets overrun with theists, will we rename it to Theist Republic?
Then we can just scare them away with geeky science and tech talk. Worked wonders for me at university when I wanted to scare away people I didn’t want to interact with, or to end an annoying discussion quickly.
Love it!
There have been days where I wrote one line of code and it was transformative for my client because it broke some logjam or other that had the whole team tied up in knots. Nothing beats looking over a junior’s shoulder after they’ve been pulling their hair out for hours over some bug, pointing to a line of code, and saying, “Have you considered changing X to Y here?” And it ends the problem. After over 40 years in this business, you can smell the problem out while others are raging that “this just can’t be happening when I did everything right” rather than thinking “I have to find where I went wrong here, because it’s not working”.
It’s not always seniority either; sometimes it’s the need for a fresh perspective. An anecdote I read yesterday in the realm of DevOps: a system deployed and worked perfectly in staging but blew up in production. It had the team of Docker experts scratching their heads for a couple of weeks, and then a relatively junior dev with a more systems-thinking approach mapped out the whole deployment workflow and spotted a problem further up the toolchain from where the Docker images were generated: something was using an encryption method production didn’t expect. He fixed the whole thing in a few minutes and became a hero, but it was just one line of code following a few hours of diligent documentation work and – ahem – thinking.
There’s nothing wrong with older languages like COBOL and FORTRAN for specific tasks. The big problem with COBOL isn’t the design of the language–it’s the aging of its developer base. Guys proficient in COBOL have retired or are retiring and younger guys show no inclination to want to learn it. Fortunately, a lot of that legacy code is decades old and thoroughly debugged and very stable. Recoding COBOL apps in Java doesn’t make economic sense most of the time.
A similar situation exists with FORTRAN. Most of the FORTRAN users I’m familiar with use it for scientific computing and are scientists, not engineers. FORTRAN is perfectly adequate for their purposes and there’s lots of existing code and libraries specifically tailored for their use.
Before I retired, I used C and assembly language, as my field was hard real-time embedded work. Newer languages like C++ may have made certain things easier to code, but they were overkill in most of the scenarios I worked in. It was difficult to hire entry-level embedded developers because most of the people coming out of universities had only been exposed to C++ and similar languages in environments with massive amounts of CPU and memory. They just couldn’t get their heads around writing code for a machine with a 32 MHz MCU with 32K of flash and 8K of RAM. They always wanted to create massively abstracted applications using design patterns with stuff like AbstractFactoryConstructorDelegationVisitorSingletonFactory. That might work okay on a machine with a 4 GHz CPU and 32 GB of RAM, but in the embedded space it’s a nonstarter.
Now that I’m retired, I can do pretty much whatever I want. I’m learning Swift and using it to write GUI applications for macOS.
Modern hardware luxury! I first learned programming on an 8-bit microprocessor with a ≈1MHz clock, with 32K RAM and no flash storage. First a crappy variant of Basic, then assembler. My biggest project in assembler on that computer was a toy compiler for a really crappy toy language. I had no idea what I was doing, but it was a lot of fun.
I wasn’t far behind you. 4 MHz, 64K of RAM, and floppies, and not long after that, my first 5 megabyte hard drive at a cost of $2K and a bank-switched total of 128K of RAM.
I fired up the old OS on a simulator not long ago, thinking it might be fun to write software under those constraints once again, but the tools were so buggy and crappy that I guess I’ve been spoiled less by faster and bigger systems than by really solid compilers, etc.
I’m often nostalgic for old computers and operating systems. Recently I installed a PDP-11 emulator and installed RSTS/E on it and compiled and ran a planetary orbital elements application I wrote in Fortran almost 50 years ago. It ran a lot faster on the emulator than it did on a real PDP-11/70 back in the day.
Next on my list is running CP/M on a Z80 emulator.
Same experience here; in fact, for things like games built on timing loops, you have to slow the emulator down to the original clock speed for them to work correctly.
For a time in the 80s I ran a word processing service on the side (customers were mostly print shops) and I almost bought out another word processing service that was using word processing software on a PDP-SomethingOrOther but decided it was too pricey and also sensed (correctly, as it turned out) that the opportunity window for that kind of work was closing anyway. Once enough individuals had word processing software and laser printers, the demand for having your resume or employee manual professionally typed and formatted collapsed.
Now that I’m retired and have lots of free time, I’m going to start acquiring some vintage hardware and getting it running. The golden age for this was the late 1990s, when vintage stuff like original Macs, old S-100 bus systems, and the like was going for peanuts, and even free in some cases, because people just wanted to get rid of the old stuff taking up space in their garage. Nowadays there’s more interest in vintage hardware, and it goes for much higher prices on eBay and other sites.
Mr. Spouse has a VIC-20 buried somewhere in the basement.