Comments by "John Smith" (@JohnSmith-op7ls) on "Thriving Technologist"
channel.
-
The worst is the “Team/Tech Lead” role, which is nothing more than a cost-cutting role. Forget what it’s supposed to be; in practice, at most places it’s a senior dev mixed with software architect, team management, and project management duties.
Every time I’ve seen it, you’re expected to put in a full week of coding, attend meetings that should be handled by a full-time manager (1 on 1s, performance reviews, higher-level manager/director meetings, project manager meetings) on top of your team meetings, do the software architecture research and make the decisions, and even handle DevOps, depending on just how deep they’re trying to cut costs.
You end up just being a dev who takes on the work of PMs and directors so that the company can get by with fewer PMs and directors. They can have a PM handling 5-6 projects, doing little more than gathering status reports and passing them up the chain, while the team/tech lead becomes the actual PM.
Oh and it’s usually like a 10-15% pay increase for like 25-30% more work and stress.
-
Tech lead is a cost-cutting role, period. Ultimately I’ve never seen it produce a net gain. It saves money on paper, but in practice you compromise the quality of the lead’s development work in order to reduce the number of project managers and/or directors.
The development side of the job is constantly hampered by admin tasks, admin-related meetings, emails, IMs, and calls.
It sounds like it would just be a developer who handles code reviews, software architecture, ensuring things are documented, settling disputes over technical matters, development related mentoring.
But every time, it ends up just being a part-time senior dev who does some of what I mentioned, and then does a bunch of things the project managers should do, and a bunch of things department managers or directors should do.
Things like scheduling PTO, performance reviews, sitting in on endless status meetings with department heads, creating documentation and presentations that really should be done by PMs.
It’s like how the medical industry made up “Nurse Practitioners” to save on paying actual doctors.
-
@TheLayeredKing Even in North America, even among those without higher education, it’s been common since the start of colonization to identify by your job. After all, the Americas were colonized by Europeans who were doing the same before that.
Basing your identity around considering yourself part of a group defined by some immutable physical characteristics has nothing to do with economics. It’s a sociopolitical trend pushed by the reframing of economic Marxism as social Marxism, and it has roots in the work of the founders of the Frankfurt School and some of their earlier compatriots in pre-WWII Europe.
People didn’t do this during the Great Depression, or the earlier Great Depression, or during the collapse of the Weimar Republic, or in Rhodesia/Zimbabwe, all of which were economic downturns that make the current situation in the US look like nothing. I could give more examples, from the economic horrors of Mao’s Cultural Revolution to Leninism, Stalinism, and the Khmer Rouge. Why didn’t people adopt racial, gender, and ethnic group identity politics in any of these cases of economic hardship, if that’s just a natural thing to do when the economy is bad?
-
This is what companies should be hiring on unless it’s a short term hire for a specific project and there’s no time to get good at the given tech stack.
But companies just don’t understand this. Managers don’t, and HR obviously doesn’t. They act as if they’re always hiring people for 1-2 years and will never change their tech stack.
They don’t value a high capacity for rapid learning backed by a solid base of critical thinking, soft skills, and knowledge of common design patterns, all of which are critical to building a good team.
It’s no wonder that companies like Google, which focus so much on pointless memorization of algorithms you’ll probably never have to code or even think much about, only have devs stick around for about 2 years on average.
This insane talent churn is internally rationalized as how to find the best talent and get rid of the rest. But it tends to do the opposite, as the best talent doesn’t have to put up with this nonsense of 8 stage interviews, constant performance reviews, unnecessarily competitive environments, and the constant looming threat of being fired for not outdoing everyone else.
-
@HealthyDev I think the definition you mentioned is the original, which then got turned into, “Anyone who doesn’t go above and beyond.”
But I’d ask, what’s wrong with the bare minimum? It’s nice to get more than what you’re entitled to, but it’s a slippery slope when “nice to have” becomes “required”.
Studies on tipping at restaurants show that tipping more doesn’t get you better service; servers just raise their expectation of what they deserve for the service they already provide.
There’s also the issue that most places don’t have a defined, straightforward path for upward mobility. It’s usually playing office politics and/or waiting for someone to quit/die/be let go.
Putting in extra effort has to lead somewhere or eventually people will stop trying.
We can say people should always strive to do their best without reward, as it’s a virtue and leads to personal growth.
But this tends to be more propaganda than anything else, an attempt to get more work out of staff without more pay, on the grounds that their reward will come in some form of personal growth and satisfaction.
The corporate version of, “If you lead a good life, you’ll be rewarded in heaven. Not now, not by us, but later, in heaven, by God.”