Comments by "" (@JinnGuild) on "Continuous Delivery"
channel.
-
I'd like to add to the comment in this video about documentation. Using TDD, we should absolutely be covering each explicit test case regarding the functionality of our code -- whether we're talking about the TDD a dev does using Unit Testing, or the TDD the team (or SDETs, or Automation QA, etc.) does with automated integration tests (functional, contract, whatever). Those tests describe exactly what is expected of the code, and those tests MUST CHANGE WHEN THE CODE CHANGES. Tests themselves are a form of documentation. Also, if we link test cases, Cucumber, BDD, blah blah back to our User Stories, Tickets, or other feature documentation, then there is a direct link from the feature being requested to how the code implements it. Not to say that replaces 100% of code comments or other documentation, but it covers a massive amount of that requirement.
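A rough sketch of what I mean (the function, test names, and the STORY-123 ticket below are all invented for illustration): the tests read like a living spec for one behavior, and they have to change if that behavior changes.

    # Hypothetical pytest example: the test names document the behavior, and the
    # (made-up) STORY-123 reference links it back to the feature request.

    def bulk_order_total_cents(quantity: int, unit_price_cents: int) -> int:
        """Orders of 10 or more items get a 5% discount."""
        total = quantity * unit_price_cents
        if quantity >= 10:
            total -= total // 20  # 5% off
        return total

    def test_orders_of_ten_or_more_items_get_a_five_percent_discount():
        # Traces back to STORY-123: "Bulk orders receive a discount"
        assert bulk_order_total_cents(10, 200) == 1900

    def test_orders_below_ten_items_pay_full_price():
        assert bulk_order_total_cents(9, 200) == 1800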
-
Part of the problem is that there is rarely such a thing as "Agile Scrum". Scrum by definition requires iterations. Iterations are GREAT! But they mainly relate to the teamwork portions -- the SDLC in general, if you will. Unfortunately, Scrum conflates that with iterative releases as well; that is, releasing every couple of weeks or so. That may have matched the lofty goal of 22 years ago, but it does not match the current state of what "Continuous Delivery" means. Scrum also requires you to perform an abhorrent amount of planning and analysis.
I have worked with a single client who "started with Scrum" and, by the strict definition of Scrum (as I remember it from ScrumMaster training 15 years ago), altered the process until they were no longer doing Scrum. At some point a few months after starting, they fell into their own Agile stride, and it was wonderful, and Scrum helped them get there. But what they ended up doing was nothing like what you'd see if you researched what Scrum is.
-
@brandonpearman9218 Long story short, yes, that is exactly what they were saying. And exactly as you mention, back then it was LOFTY even to imagine "Continuous" meaning every couple of weeks or months. That was like imagining a gigabyte of RAM in the 90's!! But just as we have home machines with tens or hundreds of gigabytes of RAM now, we have also redefined "Continuous" to literally mean continuous. No time frame. Whenever something is done, it goes!
If a team, 22 years after the Agile Manifesto was written, is literally delivering every few weeks because that was the standard set by the Manifesto, they are obviously a couple of decades behind the curve.
Similarly, when they talk about face-to-face communication, keep in mind they didn't have smartphones with 5G video, or Slack. They were still working on paper and using e-mail. "Face to face" today absolutely includes emojis and chat, and I'm sure we'd all agree it includes Meet/Zoom/Teams/etc.
I am actually disappointed in this video for not pointing these things out more clearly.
-
@Dave - Yes, yes, and also yes. I especially want to "Yes" your point at @16:53 -- Correct me if I'm wrong, but the pattern you gravitate toward is ISFC (Imperative Shell, Functional Core). You refactor to allow for Dependency Injection, then you separate out the "impure" stuff like hitting external SDKs or databases, and you isolate the functional pieces to be as "pure" as possible. Maybe you can't reach perfect ISFC, but that's essentially the direction you go in -- would you agree? In my own consulting, I find it valuable to make clear to people new to Unit-Testing TDD that the code resulting from TDD is best viewed as something like "pure" functional methods. When your Unit-Testing TDD is finished, you then wrap the resulting great, functional-esque code with an imperative shell that USES your logic, composing it to and from impure calls.
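To make that concrete, here's a minimal ISFC sketch with invented names (LineItem, OrderRepository, calculate_invoice_total, etc.), not anything from the video:

    from dataclasses import dataclass

    @dataclass
    class LineItem:
        quantity: int
        unit_price_cents: int

    # Functional core: pure domain logic, no I/O, trivially unit-testable.
    def calculate_invoice_total(items: list[LineItem]) -> int:
        return sum(item.quantity * item.unit_price_cents for item in items)

    # Imperative shell: composes the pure core with the impure calls.
    class InvoiceService:
        def __init__(self, order_repository, invoice_gateway):
            # Dependencies are injected so they can be swapped or faked.
            self._orders = order_repository
            self._invoices = invoice_gateway

        def invoice_order(self, order_id: str) -> int:
            items = self._orders.load_line_items(order_id)  # impure: database
            total = calculate_invoice_total(items)          # pure: domain logic
            self._invoices.send_invoice(order_id, total)    # impure: external SDK
            return total

The unit tests born from TDD live against calculate_invoice_total; the shell just wires it to the outside world.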
Since my question to you is edging toward a dark space that sounds like "TDD never touches a database", I also want to clarify that. When I talk about TDD, I assume that people hear my words as being about Unit Testing prior to writing "Domain Logic". But I almost always clarify that TDD isn't only about that. The whole Shift-Left mentality comes into play, even as far as saying that SDET teams (or their ilk) should be writing tests against APIs as soon as the contracts are decided -- possibly even finishing before dev work starts, but definitely finishing before dev work ends. TDD absolutely includes testing "integration" and "acceptance". Though generally, when our audience is the typical dev working on the middle tier, TDD is taken to mean Unit Testing before writing Domain code.
Thoughts?
-
@giovani5586 I can understand that being some people's viewpoint, which is why I said it depends on what your ratio is. But I'd rephrase your statement to something like this:
Operations, Planning, Security, Reliability Engineering, Data Architecture, User Experience, Quality Engineering, Software Engineering (etc etc) are all *Information Technology* subjects.
Quality Engineering is not Software Engineering, though they are two disciplines that work together.
IT Operations is not Software Engineering, though it is required for Software to run.
Site Reliability is not Software Engineering, though it is the latest role for companies following best practices.
etc etc
As a 20-year Software Engineer (and "Developer" before that), I could fill multiple full-time jobs, roles, or contracts purely focusing on the principles and practices that elevate Software Development into an engineering mindset. TL;DR: how we write code to do this thing vs. why we write code to do this thing.
In the context of this video (and Cloud Computing), which is an Operations/SRE concern, Software Engineering is absolutely required, because (purely in relation to code) we need to make choices about how we abstract our work, how we decouple different parts of the system, how we build stateless systems, how we make sure we aren't building "Distributed Monoliths", and all kinds of other code-related concerns.
So I can see some argument, or some ratio, for including those other *Information Technology* roles and conflating them with "Software Engineering". But I personally set my own ratio as a "Software Engineer" to something like 10% focused on Operations and hardware, simply to overlap with those roles in a DevOps mentality, maybe 50% focused on coding, and plenty of room left for other things not discussed here.
-
You are correct in some ways. Though I'll point you to the publications of most expert Software Engineers, including almost every big name out there: Dave being one of them with his book, Mark Seemann, Uncle Bob, etc.
They make it fairly clear that Software Engineering, while being a discipline that requires scientific reasoning, logic, measurements, and so forth, is different from other engineering disciplines. For almost the exact reason you point out.
Physical engineering disciplines don't have the luxury of tearing down their bridges, infrastructure, and buildings, and blowing up cars over and over. It is violent and cost-prohibitive.
But I guarantee you... if they could, they would. Any engineer (physical or digital) who has experience on both sides is extraordinarily envious that Software Engineering has this possibility to crash our proverbial cars millions of times over while attempting different implementations, until we arrive at exactly the best one.
Our industry is in a unique position where we CAN do that. If your main experience is outside of such luxury, I can see how it would look insane. But if you give in to the absolutely absurd power it gives us digitally, you can see the value of running hundreds of tests millions of times over, constantly, in an iterative approach to improve your code until it is finally ready for production.
-
When I consult for companies, I make it an absolute priority to touch on both Conway's Law and the fact that "Agile" is a set of principles surrounding the SDLC. SD, obvious to us, means Software Development. Frameworks like "SAFe" are business processes that strive to hook the business (not Software Development) into a hopefully "Agile"-principled SDLC. If you aren't first Agile in your SDLC, then you can't claim to have business processes that "hook into your Agile SDLC". A business can't be Agile. But as per one of the Agile Principles, the business can support the Software Development team, give them the tools they need, and trust them. As the business steps back, the SD team needs to provide those hooks where the business can be looped in -- especially for demos, and possibly velocity reports if you subscribe to that.
-
Those are very important questions -- but they are questions regarding Integration Testing, which includes almost everything (performance, acceptance, etc.) that is affected by factors outside of your personal code. Unit Testing is a hyper-focused topic specifically about the (typically functional) pure code you write: algorithms, business rules/logic, etc.
This video has exactly that focus: developer "Unit Testing" by following TDD. Worrying about different databases, OSes, and so forth doesn't affect this level of focus.
But your question is absolutely valid for the more meta definition of TDD, which includes a whole "Shift-Left" mentality, SDETs, and more. It's just not what this video is about.
-
@d3stinYwOw That's exactly right. But again, that isn't Unit Testing. If your test behaves differently depending on configuration, or on where it runs, it isn't a Unit Test, and therefore it isn't related to this video's points. If a test relies on tool versions, access rights to endpoints, databases, or any other system it "integrates" with, then it's an Integration Test. That includes most UI testing, Acceptance Testing, Regression Testing, Performance Testing, etc. etc. etc.
I had to point out in a different comment that when I say "UI Testing isn't Unit Testing", I am only talking about those UI tests that are Integration Tests. There are plenty of UI Services and Behaviors that can be tested in such a way that they don't hit external factors.
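A tiny invented contrast (the function and tests are made up), just to illustrate where I draw that line:

    import sqlite3

    def normalize_email(raw: str) -> str:
        return raw.strip().lower()

    def test_normalize_email():
        # Unit Test: pure input/output, behaves identically on any machine,
        # any configuration, any tool version.
        assert normalize_email("  Dave@Example.COM ") == "dave@example.com"

    def test_saving_a_user(tmp_path):
        # Integration Test: depends on a real database engine (even an embedded
        # one), so environment, versions, or access rights can change the outcome.
        conn = sqlite3.connect(tmp_path / "users.db")
        conn.execute("CREATE TABLE users (email TEXT)")
        conn.execute("INSERT INTO users VALUES (?)", (normalize_email(" A@B.com "),))
        (email,) = conn.execute("SELECT email FROM users").fetchone()
        assert email == "a@b.com"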
-
Dave had a narrow view in this video, but I think he kept it on topic. TDD is a huge concept which includes a lot of areas of testing. But to "The Developer", whom this video is targeting, it overwhelmingly means [UNIT] Testing. That is almost exclusively testing functional code, algorithms, etc. If your testing expands to cover graphics, UX, database interactions, or anything that ends up involving any measurable integration between systems, those are Integration Tests -- STILL PART of TDD in a wider scope, but that scope frequently includes tests written by SDETs, QA, Integration Teams, Infra, etc.
Video games do have a very high number of algorithms and functional code. And even if Dave (et al.) don't directly agree that Unit Testing is purely for functional code, I think they would agree there is some fairly obvious line where testing is no longer Unit Testing once some amount of framework or integration comes under fire.
So you may be right on the macro level that TDD (including UX testing and such) is nearly impossible for things like Video Games. But at the scope this video is bounding itself to (Unit Testing), the concepts may still be applicable.
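For example (a made-up bit of game logic, not from the video), this is the kind of code that stays unit-testable even when the rendering and engine loop are not:

    # Hypothetical game rule: a pure function, so it can be unit tested without
    # touching the engine, graphics, or any other integration.
    def damage_dealt(attack: int, defense: int, critical: bool) -> int:
        base = max(attack - defense, 1)  # always at least 1 point of damage
        return base * 2 if critical else base

    def test_damage_is_doubled_on_a_critical_hit():
        assert damage_dealt(attack=10, defense=4, critical=True) == 12

    def test_damage_never_drops_below_one():
        assert damage_dealt(attack=3, defense=9, critical=False) == 1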
-
@ContinuousDelivery Yes indeed. I agree, and that scenario shows how things can vary. I did say a couple of times that each person will have a different ratio in their definition of "Software Engineering", including how much they lean toward the code side versus how much they overlap with Process, Design, Infra, etc. I'm sure that in some companies Software Engineers purely do code, while in others "Software Engineer" is defined such that code isn't even on their mind, and everything in between.
-
Hey Dave! Firstly, 100% on your list of common arguments. But I would propose that developers "hate TDD" almost exclusively because of bad management. Managers are the ones arguing the points you lay out, and they do so with rank authority. I am describing a complicated reality. Some management, frequently as high up as executives, would rather hire a more compliant and quiet developer than listen to "some idealistic engineer" constantly pushing for TDD (or other engineering principles/processes like CI/CD). Managers have vastly varying personalities and backgrounds, but they do tend to wax authoritarian and waterfall. ((( This doesn't mean engineers shouldn't use your tips to push for improvement. They Should!!! )))
In an upcoming speaking engagement, I am going to review how novice "workers" (devs) have certain personalities, and how they grow either into Managers or into Engineers. I would argue that your video leans too heavily toward trying to convince those "workers" who already have engineering mindsets. I contend that they all already agree. The real target audience needs to be Managers (or novice workers with managerial/business personalities who will one day grow into managers).
This video has value, but our whole industry (Software Engineering) needs a bigger initiative demanding that trust and respect be given to engineers with the expertise to make these decisions. It's a huge initiative that dwarfs the constant repetitively redundant echo of experienced engineers like yourself writing articles, books, blogs, vlogs, videos, and it never goes anywhere because we all already agree.
In another (agile?) video, you said something about how managers make bad decisions because they don't trust their developers. The solution to that isn't to blindly hire anybody off the street and trust them, but to be more diligent in your hiring, as well as to be explicit about how you implement continuous education and training that moves developers toward a trustworthy engineering mindset.
TL;DR: You (we) should focus on that -- building systems that demand (through proof and voice, not force) that management trust their engineers.
TL;DR of the TL;DR: We demand that managers trust their engineers.