Yes, software is getting worse, because education and corporate culture are getting worse.
Where employees used to need to know what they were actually doing, much of that is now auto-filled by IDEs and languages that compile down to other languages, so employees need to know fewer and fewer fundamentals.
Which in turn means that when a low-level error occurs, either no one knows how to fix it, or the company refuses to hire someone who does because they're "overqualified" and would therefore "cost too much".
Do you think complexity and scope stayed the same? Or did they increase? Do people have to know more now to have the same level of depth and surrounding knowledge?
I'd say no. Yes, in game development, for example, new tech has come up that wasn't there 10-30 years ago, but the "how" to do it was already on paper decades earlier.
It just wasn't feasible to implement with the technology of the time.
Thanks to IDEs and the like, it's significantly easier to just create stuff these days, which is great for indie devs and similar.
It does, however, also mean that the implementation of tech X will be sub-optimal in most situations, because people don't really understand the underlying tech.
Outside of corporate that can be solved by asking for help or advice online, or by looking it up; but in a corporate setting that'd likely get you branded "overqualified", and they'd fire your ass for spending development time improving or fixing something instead of just pushing, pushing, and pushing.
'course there are also programming fields aimed specifically at closing the gaps left by IDEs etc., to make them even easier and more efficient to use.
So basically: Fuck big corpo, and fuck "education" that prepares you for corporate rather than teaching you the fundamentals.