I’m sure the astronauts should survive the landing, there should be a way to return, and they need a shitter, all part of the missed requirements. As it’s a waterfall project, those will come in the second, third, and fourth trips.
Oil and gas products account for 4.2% of Sweden’s exports. The gas exports alone almost rival those of dairy and eggs! Truly a petrostate, if I ever saw one.
Well the largest category is
I would say yes and no. Yes, the clone command can do it, but branching and CI get a bit more complicated, and pushing and reviewing changes makes it harder to keep an overview. If the functionality, and especially the release cycle, is different, submodules still have great value. As always, your product and repo structure is a mix of different considerations and always a compromise. I think the additions to git over the last years have made the previously really bad pain points with bigger repos less annoying, so I now see more situations where it works well.
I always recommend keeping all testing in the same repo as the code it affects. It makes tracking changes in functionality easier; needing to coordinate commits, merges, and branches across more than one repo is a bigger cognitive load.
Yet, that was the real Linux-phone killer move by the former Microsoft executive who became CEO of Nokia. Also the move that killed Nokia phones.
It’s also easier to work when one simple git command can get everything you need; there is a good case for a bigger mono-repo. It should be easy to debug tests at all levels, or else it’s hard to fix the issues that the bigger tests find. Many recent changes in git, such as sparse checkout and partial clone, make the downsides of a bigger repo less hurtful, and the gains now start to outweigh the losses of a bigger repo.
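To make that concrete, a minimal sketch (hypothetical repo URL; assuming a reasonably recent git with partial-clone support): one command fetches the whole mono-repo while deferring the download of file contents until a checkout actually needs them.

```
# Sketch: clone a big mono-repo with one command. The --filter=blob:none
# option (partial clone) skips file contents until checkout needs them,
# which keeps the initial clone of a big repo fast.
import subprocess

subprocess.run(
    ["git", "clone", "--filter=blob:none",
     "https://example.com/big-mono-repo.git"],  # hypothetical URL
    check=True,  # raise if the clone fails
)
```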
I agree that in most cases it’s more of an E2E or integration test. I’m not sure of the need to split it into a different repo, and in the end I’m not sure that would have provided any big protection anyhow.
The Windows Lumia was a battery-draining, buggy version of the Linux-based N9, with a smaller screen. When they turned the N9 into a Windows phone (the Lumia 800), they had to use a smaller screen and less memory, because Windows couldn’t handle the hardware the way MeeGo could.
I’m not sure if it was Yggdrasil or Slackware that we tried out on the old university computers, but Debian quickly became so much more flexible.
Says no one who knows vim, though it has a vi-like mode that is missing most of the advanced vi tricks.
Or in the office: the hardware-software relations between the laptop and Windows (and in some parts Linux) are strained at best, where drivers, power management, and so on get crappy. E.g. after a year or two of updates it gets out of control, and nice things like hibernation stop working. It’s usually a driver for some small thing you don’t care about, one that missed a change in the Windows specification and now can’t do its power handling properly. Oops, the computer refuses to sleep, your bag is burning, and the battery is at 1% when you pick the computer up again.
I’m not sure; many developers use a Mac to get working Unix tools plus working “enterprise” tools like Teams and the other crap the company mandates for “everyone”. Sadly, many of these tools work like crap on Linux, and at best the web version is workable.
LiveNation has fixed the issue that artists got paid for playing live: now the companies can take that cut as well.
Imho on any server today all editors should be removed. You edit on your workstation and provision to the server.
I prefer using tools like Ansible or Terraform, but I write the code for them in a GUI from JetBrains. Then I deploy from CI, using git from the command line.
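As a sketch of that flow (the inventory path and playbook name are made up; it assumes Ansible is installed on the CI runner), the CI job only needs to update the repo and hand it to ansible-playbook:

```
# Hypothetical CI deploy step: fast-forward to the reviewed, merged
# provisioning code, then let ansible-playbook push it to the servers.
import subprocess

subprocess.run(["git", "pull", "--ff-only"], check=True)
subprocess.run(
    ["ansible-playbook", "-i", "inventory/prod", "site.yml"],
    check=True,  # a failed provisioning run should fail the pipeline
)
```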
A good IDE also helps you do better refactoring, making the code so much easier to read. That is the main goal of any code.
I blame the rise of frameworks, libraries, and IDEs
Without good libraries and frameworks, we can hardly get any software working in today’s environment; we get stuck with a slow development cycle and software that doesn’t do what the users want of it. A few years ago, I was at a customer who was using an old Linux distribution at their customer’s site. For contractual reasons it was not upgraded to the latest version, and they had skipped keeping up to date with changes as they came. Every step of development became a hassle, and the good programmers there were not able to deliver features at any predictable rate. There were issues with HTTPS: most webservers today mandate at least TLS 1.2, but when the OS only supports SSLv2, SSLv3, and TLS 1.1, connecting to the internet gets hard.
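To illustrate the TLS part, a minimal sketch (assuming Python 3.7+ with a modern OpenSSL): current clients refuse anything below TLS 1.2, so a handshake against a stack that tops out at TLS 1.1 fails outright.

```
import socket
import ssl

# Modern defaults already reject SSLv2/SSLv3; here the TLS 1.2 floor
# is made explicit.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

with socket.create_connection(("example.com", 443)) as sock:
    # Against a server limited to TLS 1.1 or older this raises SSLError.
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())  # e.g. 'TLSv1.3'
```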
Having to develop all functionality from the ground up means features the customers need never get released. With most developers I have worked with, using good libraries also makes the implementations less prone to serious bugs.
I assume you never worked in testing. Back in the day, we used to cram testing into a weekend because the developers were late with their coding. There was no test automation, so that weekend we spent all the time on the most basic functionality, barely getting through testing that the app even started and some of its most basic functions. There was almost never any time for regression testing, and old functions broke all the time. It wasn’t uncommon that we shipped a bug fix in one version, just to reintroduce the same bug in the next release.
It seems to me that the author doesn’t remember all the struggles we had back then with bugs and features not working, and the masses of needed functionality that never got shipped into the hands of users. It also strikes me that maybe there is a bit of nostalgia, a bit of reluctance to change his ways: he found a workflow around the missing functionality that might be blocking for others, and he has a harder time adjusting to the new functionality.
A bit like my father, who refused to change his workflow. To make images for his (all static) webpages he used several different Amiga programs: one because it could scale the images, one to edit them and add lines and stuff, one to help him make image maps, and one to convert them to jpg/png, since the anim files used by everything else on the Amiga didn’t work well on the internet.
Bug testing back then was awful; we never had time to catch any issues but the biggest. The time plan for the release was fixed years ahead, and so was the functionality that was needed. All the time set aside for testing was eaten up by the developers working right up to the final ship to customers, trying to make the software actually run. It wasn’t uncommon for test teams to try to cram months of testing into a weekend so the software could ship on Monday morning, including masses of needed bug fixes made during that weekend, where no one knew which code each issue had actually been tested on. Remember that version control systems were almost never used, there was no CI build system, and all software was built on some random developer’s workstation, maybe with some additional changes for his or her convenience. No, software development has come a long way since the 90s. A very long way!
JetBrains: the refactoring tools are much better than any alternative, and that is a great productivity booster. It also has excellent remote support. At the moment I’m mainly using PyCharm and CLion.
Docstrings are user documentation, not comments. User documentation, with examples (tests), is always useful.
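Python’s doctests are a nice minimal example of this (the `slugify` helper below is made up for illustration): the examples in the docstring are user documentation and tests in one.

```
import re

def slugify(title):
    """Turn a page title into a URL slug.

    The examples below double as tests; run them with
    `python -m doctest this_module.py -v`.

    >>> slugify("Hello World")
    'hello-world'
    >>> slugify("  Already--slugged  ")
    'already-slugged'
    """
    return "-".join(re.findall(r"[a-z0-9]+", title.lower()))
```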