With about 12 years in my primary language, I'd say my expertise is expressed in knowing exactly what to Google.
This is probably the true highest level of expertise you'll get out of most professional coders.
It takes a real monk-level commitment to understanding the language to break out of being proficient at looking shit up and start being the person who writes the shit people are looking up.
I've learned a lot by breaking things. By making mistakes and watching other people make mistakes. I've written some blog posts that make me look real smart.
But mostly I just bang code together until it works, and run tests and perf stuff until it looks good. When I have the time, I write it up and check back on what was really happening.
But I still mostly learn by suffering.
But I still mostly learn by suffering.
That resonates so much. Almost every time someone is deeply impressed with something I know, it brings back a painful memory of how I learned it.
I really like brain twisters. It can get frustrating at times, but it's the most fun part of the profession for me.
Knowing the footguns in your language is always useful. The more you know, the less you’ll shoot your foot.
I think that one of my issues is that I'd like to be more knowledgeable about the smaller bits and bytes of C, but I don't have the time at work to go deeper and I don't have any free time because I have young kids.
I don't have any free time because I have young kids.
That's a healthy thing to acknowledge.
It's a brutal phase for professional development, hobbies, free time, sex, basic housekeeping...
It gets better as the little ones grow.
At least we know emotionally that it will get better with the second one, haha, even if the day-to-day is rough.
With the first one, it felt like we would never get to the other side of it. But we did and we will for the second one.
I am eager to learn new things, so having so little free time is definitely tough. And the lack of sleep/energy makes it even harder.
Thanks for the encouragement, it's nice to be acknowledged by someone else that went through the same thing. We often forget that we are not alone and a lot of people got through it before us.
I don’t know about your workplace, but if at all possible I would try to find time between tasks to spend on learning. If your company doesn’t have a policy where it is clear that employees have the freedom to learn during company time, try to underestimate your own velocity even more and use the time it leaves for learning.
About 10 years ago I worked for a company where I was performing quite well. Since that meant I finished my tasks early, I could have taken on even more tasks. But I didn’t really tell our scrum master when I finished early. Instead I spent the time learning, and also refactoring code to help me become more productive. This added up, and my efficiency only increased more, until at some point I only needed one or two days to complete a week’s sprint. I didn’t waste my time, but I used it to pick up more architectural stuff on the side, while always learning on the job.
I’ll admit that when I started this route, I already had a bunch of experience under my belt, and this may not be feasible if you have managers breathing down your neck all the time. But the point is, if you play it smart you can use company time to improve yourself and they may even appreciate you for it.
There’s a lot to talk about from this point alone, but I’ll be brief: having gone through university courses on processor design and cutting my teeth on fighting people for a single bit in memory, I’m probably a lot more comfortable with that minutia than most; having written my first few lines of C in 10 years to demo a basic memory safety bug just an hour ago, you’re way way ahead of me.
There are different ways to learn and gain experience and each path will train us in different skills. Then we build teams around that diversity.
How do you rate yourself in your most used language?
I know things that no human should have to carry the knowledge of
Do you understand the subtleties and the nuance of your language?
My soul is scarred by the nuanced minutia of many an RFC.
in the hope of reducing my imposter syndrome.
There's but two types in software - those who have lived to see too much...and those who haven't...yet.
After almost 12~15 years of programming in C and C++, I would give myself a solid "still don't know enough" out of 10.
After almost 12~15 years of programming in C and C++, I would give myself a solid "still don't know enough" out of 10.
That resonates so thoroughly.
And while it can 100% also be the case in any tool or language, it's somehow 300% true for C and C++.
In C in particular, you have to be very cognizant of the tricky ways the language can screw you with UB. You might want to try some verification tools like Frama-C, use UB sanitizers, enable all the compiler warnings and traps that you can, etc. Other than that, I think using too many obscure features of a language is an antipattern. Just stick with the idioms that you see in other code. Take reviewer comments on board, and write lots of code so you come to feel fluent.
Added: the MISRA C guidelines for embedded C tell you to stay with a relatively safe subset of the language. They are mostly wise, so you might want to use them.
Added: is your issue with C or with machine code? If you're programming small MCUs, then yes, you should develop some familiarity with machine code and hardware level programming. That may also help you get more comfortable with C.
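To make that concrete, here's a minimal sketch (my own, assuming a gcc or clang toolchain) of the kind of silent UB those tools turn into a loud runtime diagnostic:
/* Compile with: gcc -Wall -Wextra -g -fsanitize=address,undefined ub_demo.c */
#include <limits.h>
#include <stdio.h>

int main(int argc, char **argv) {
    (void)argv;

    int big = INT_MAX;
    big += argc;                     /* signed integer overflow: undefined behaviour */

    int buf[4] = {1, 2, 3, 4};
    int past = buf[argc + 3];        /* argc is at least 1, so this reads past the end */

    printf("%d %d\n", big, past);
    return 0;
}
Without the sanitizers this will likely "work" most days, which is exactly what makes UB in C so insidious; with them, you get a file-and-line report the first time the bad path actually runs.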
Yeah, but they make me MISRAble.
If you step in enough shit, you eventually learn to realise when you're about to step in it again. I think the most knowledgeable people are those who have failed the most and found something helpful along the way; it seems you are well on your journey, so just keep stepping. At some point the abstractions you have control over become unreliable until you understand how they interact with lower-level systems, and the balance of control comes back because you now know the circumstances in which those abstractions work in your favour.
A one out of ten. I consider myself the world's second worst programmer.
By any chance, do you use a niche language that has only two programmers?
Nope. I'm just that bad. I feel like I have a logical mind, but it just seems like the commands don't do what I think they will, or won't operate on a certain type of variable, or, holy crap, I forgot a friggin' space or semicolon or something.
Languages in order of proficiency: C++, HTML/CSS, Matlab, Basic, Fortran (1 class taken).
But when I say proficient I seriously mean looking stuff up on the internet for every single line. And I haven't used Basic in decades.
Who is the first?
The guy who was using my name to make code submissions 2-3 years prior.
Odds are the worst one is still using Twitter.
Better than many, mediocre.
Among my coworkers, I've got a strange ability to pick up any language that tastes like c and get stuff done. I'm sure I've confused our C# guys when I make a change to their code and ask for a code review, because I'll chase down quality-of-life improvements for myself. (Generally I'll make the change and ask whether it has any unintended side effects, because on an MCU I know what all my side effects are; in a multi-threaded application, not at all.)
Edit: coming from a firmware view, I've made enough mistakes to realize when order of operations will stab me, when a branch is bad because that pipeline hit will hurt, and I still get & vs && wrong more often than I would like to admit.
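A quick illustration of that last gotcha (my own sketch, not anything from real firmware): == binds tighter than &, so the "obvious" bit test silently turns into something else.
#include <stdio.h>

#define READY 0x04u                        /* status bit we actually care about */

int main(void) {
    unsigned flags = READY;                /* only the READY bit is set */

    if (flags & READY)                     /* bitwise test: true, as intended */
        printf("& : READY bit is set\n");

    if (flags && READY)                    /* logical &&: true whenever flags is non-zero at all */
        printf("&&: true, but it isn't testing the READY bit\n");

    if (flags & READY == READY)            /* parses as flags & (READY == READY), i.e. flags & 1 */
        printf("this never prints, even though READY is set\n");

    return 0;
}
Compilers with -Wall turned on will usually suggest parentheses around that last comparison, which is one more reason to keep the warnings cranked up, as mentioned elsewhere in the thread.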
I just have to say "tastes like c" is a visceral way to say it. I approve.
I've been writing code for 25+ years, and in tech for 27+.
I'm a novice at all languages still. Even though they tell me I'm a Principal Engineer.
There's always some new technique or better way to do what I want, and I'm learning every day. It never stops. The expectations for what I consider to be good code just keep climbing.
I should know more about what's happening under the hood.
You've just identified the most important skill of any software developer, IMO.
The three most valuable topics I learned in college were OS design basics, assembly language, and algorithms. They're universal, and once you have a grasp on those, a lot of programming-language specifics become fairly transparent.
An area where those don't help is paradigm specifics: there's theory behind functional and OO programming which, if you don't understand it, won't impede you from writing in that language, but will almost certainly result in really bad code. And, depending on your focus, it can be necessary to have domain knowledge: financial, networking, graphics.
But for what you're talking about, those three topics cover most of what you need to intuit how languages do what they do, especially C, because it's only slightly higher-level than assembly.
Assembly informs CPU architecture and operations. If you understand that, you mostly understand how CPUs work, as much as you need to to be a programmer.
OS design informs how various hardware components interact, again, enough to understand what higher level languages are doing.
Algorithms... well, you can derive algorithms from assembly, but a lot of smart people have already done a ton of work in the field, and it's silly to try to redo that work. And, unless you're very special, you probably won't do as good a job as they've done.
Once you have those, all languages are just syntactic sugar. Sure, the JVM has peculiarities in how its garbage collection works; you tend to learn that sort of stuff from experience. But a hash table is a hash table in any language, and they all have to deal with the same fundamental issues of hash tables: hashing, conflict resolution, and space allocation. There are no shortcuts.
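To the hash table point: here's a deliberately minimal sketch in C (my own, fixed bucket count, chaining) just to show those three concerns, hashing, conflict resolution, and space allocation, sitting right there in the open:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define NBUCKETS 64                      /* space allocation: a fixed bucket array */

struct entry {
    char *key;
    int value;
    struct entry *next;                  /* conflict resolution: separate chaining */
};

static struct entry *buckets[NBUCKETS];

/* hashing: a simple multiply-and-add string hash */
static unsigned long hash(const char *s) {
    unsigned long h = 5381;
    while (*s)
        h = h * 33 + (unsigned char)*s++;
    return h % NBUCKETS;
}

static void put(const char *key, int value) {
    unsigned long i = hash(key);
    for (struct entry *e = buckets[i]; e; e = e->next)
        if (strcmp(e->key, key) == 0) { e->value = value; return; }
    struct entry *e = malloc(sizeof *e);
    e->key = malloc(strlen(key) + 1);
    strcpy(e->key, key);
    e->value = value;
    e->next = buckets[i];                /* push onto the chain */
    buckets[i] = e;
}

static int get(const char *key, int *out) {
    for (struct entry *e = buckets[hash(key)]; e; e = e->next)
        if (strcmp(e->key, key) == 0) { *out = e->value; return 1; }
    return 0;                            /* not found */
}

int main(void) {
    put("answer", 42);
    put("answer", 43);                   /* update goes through the same chain */
    int v;
    if (get("answer", &v))
        printf("answer = %d\n", v);
    return 0;
}
Every language's built-in dict/map is doing some refinement of exactly this, just with resizing, better hashes, and nicer syntax on top.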
After 6 years of seriously using Python regularly, I'd probably give myself a 6/10. I feel comfortable with best practices and making informed design decisions. I have no problem using linting and testing tools. And I've contributed to large open source projects. I could improve a lot by learning more about the standard library and some core computer science concepts that inform the design of the language. I'm pretty weak in web frameworks too, unfortunately.
After 3-4 years of using Python I'm bumping you up to a 7 so I can fit in at a 5. Congrats on your upgrade. I've never contributed to open source, but I've fixed issues in publicly archived tools so that they aren't buggy for my team. I can see errors and know what likely caused them, and my code literacy is decent. That being said, I think I'm far from advanced.
8/10 Server-side JavaScript
7/10 Ampscript
3/10 SQL
There is something about SQL that I can't get to click with me. I can run basic queries and aggregation, but I can never get nested queries to work right.
All of these also assume I have access to documentation. Without documentation, all of them are like a 2. 🤷
I have advice that you didn't ask for at all!
SQL's declarative ordering annoys me too. In most languages you order things based on when you want them to happen; SQL doesn't work like that: you need to order query syntax based on where that bit goes according to the rules of SQL. It's meant to aid readability, and some people like it a lot, but for me it's just a bunch of extra rules to remember.
Anyway, for nested expressions, I think CTEs make stuff a lot easier, and SQL query optimisers mean you probably shouldn't have to worry about performance.
I.e. instead of:
SELECT
one.col_a,
two.col_b
FROM one
LEFT JOIN
(SELECT * FROM somewhere WHERE something) as two
ON one.x = two.x
you can do this:
WITH two as (
SELECT * FROM somewhere
WHERE something
)
SELECT
one.col_a,
two.col_b
FROM one
LEFT JOIN two
ON one.x = two.x
Especially when things are a little gnarly with lots of nested CTEs, this style makes stuff a tonne easier to reason with.
I'm 100% going to try this, but I have a feeling that it isn't going to work in my application. Salesforce Marketing Cloud uses some pared-down old version of Transact-SQL and about half of the functions you'd expect to work just flat out don't.
The joys of using a Salesforce product.
Oh boy, have fun! CTEs have pretty wide support, so you might be in luck (well, at least in that respect; in all other cases you're still using Salesforce and my commiserations are with you).
Salesforce just gives me the other kind of CTE.
I loathe debugging ampscript and anything to do with Marketing Cloud with a passion.
What helped me a lot with pushing deeper down into the language innards is to have people to explain things to.
Last week, for example, one of our students asked what closures are.
Explaining that was no problem, and I was also able to differentiate them from function pointers, but then she asked what the Rust traits/interfaces Fn, FnMut and FnOnce do (which are implemented by different closures).
And yep, she struck right into a blank spot of my knowledge with that.
I have enough of an idea of them to just fill in something and let the compiler tell me off when I did it wrong.
Even when designing an API, I've worked out that you should start with an FnOnce and only progress to FnMut, then Fn and then a function pointer as the compiler shouts at you (basically they're more specific and more restrictive in what the implementer of the closure is allowed to do).
But yeah, these rules of thumb just don't suffice for an actual explanation.
I couldn't tell you why these different traits are necessary or what the precise differences are.
So, we've been learning about them together and I have a much better understanding now.
Even in terms of closures in general (independent of the language), where I thought I had a pretty good idea, I had the epiphany that closures have two ways of providing parameters, one for the implementer (captured out of the context) and one for the caller (parameter list).
Obviously, I was aware of that on some level, as I had been using it plenty of times, but I never had as clear an idea of it before.
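For what it's worth, that two-channel picture shows up even in plain C, where a "closure" has to be hand-rolled as a function pointer plus a struct of captured state (my own sketch, nothing Rust-specific):
#include <stdio.h>

/* The "captured" channel: state the implementer bakes in when the closure is made. */
struct counter_ctx {
    int step;        /* captured out of the surrounding context                 */
    int total;       /* mutable captured state (roughly what Rust's FnMut models) */
};

/* The "caller" channel: the ordinary parameter list supplied at call time. */
static int add_and_count(struct counter_ctx *ctx, int x) {
    ctx->total += ctx->step;
    return x + ctx->step;
}

/* A hand-rolled closure: the function pointer and its environment travel together. */
struct closure {
    int (*call)(struct counter_ctx *, int);
    struct counter_ctx ctx;
};

int main(void) {
    struct closure inc = { add_and_count, { .step = 5, .total = 0 } };

    /* the caller supplies 10 and 20; the captured step = 5 comes along for free */
    printf("%d\n", inc.call(&inc.ctx, 10));                    /* 15 */
    printf("%d\n", inc.call(&inc.ctx, 20));                    /* 25 */
    printf("calls made: %d\n", inc.ctx.total / inc.ctx.step);  /* 2  */
    return 0;
}
The bare function pointer is just code; it's the struct travelling with it that makes it a closure, which is more or less the distinction between plain fn pointers and the Fn/FnMut/FnOnce-implementing closures discussed above.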
The more I learn about my language the less I think it matters. Maybe in embedded C you can’t just leave everything to the compiler though.
A solid 5.
I'm happy with it too. They still pay me so I must be doing something right. Almost two decades now.
5 years professionally and I can find jobs, so yeah, I must be doing something decent. But that imposter syndrome has been strong these last few weeks.
I am very proficient in my primary language, C#.
Writing more context out feels like boasting, so I think I will skip that and go to a summation/conclusion directly.
Knowledge and expertise come from more than the language, which you hinted at. The language is only our interface: how is the language represented, how will it transform the code, how will it be run? There's a lot of depth in there - much more than there is in the language itself.
I learned a lot through my own studies, reading, projects, and experience. I'm a strong systematic thinker. It all helps me interpret and think about both the breadth and the depth of context and concerns. I also think my strengths come at the cost of other things, at least in my particular case.
You're not alone. Most developers do not have that depth or breadth of knowledge. And most consequently struggle with, or are oblivious to, many concerns and opportunities, and struggle to intuitively or quickly understand and follow such information.
Which does not necessarily mean they're not productive or useful.
Through the different replies, I reflected on what I know and what I do for work and I feel like my skillset is more akin to a generalist/integrator, which is needed. But I also feel like everyone in my domain does that. Which might or might not be true.
I guess knowing our strengths and weaknesses is also a skill in itself and a little bit of self doubt here and there can help us grow and direct our knowledge in a certain direction.
Thanks for the insight.
I would give myself a solid 4.2/5 on python.
- I have in-depth knowledge of more than a few popular libraries, including flask, django, marshmallow, typer, sqlalchemy, pandas, numpy, and many more.
- I have authored a few libraries.
- I have been keeping up with PEPs, and sometimes offered my feedback.
- I have knowledge of the internals of development tooling, including mypy, pylint, black, and a pycharm plugin I have created.
I wouldn't give myself a 5/5 since I would consider that an unattainable level of expertise, with maybe a few exceptions around the globe. IMO the fun part of being really good at something is that you understand how much there still is to learn ❤️
Novice, still learning every day.
I’m mostly working in Java now. I’m proficient to the degree that I can solve most things without looking for reference online. I think that matters most to me.
I've been using C# since .NET 2, which came out around the turn of the century (lol).
I'd happily call myself an expert. I can do anything I need to and easily dive into the standard library source code or even IL when needed.
But even then there are topics I could easily learn more on, particularly the very performance-focused struct features and intrinsics.
I've found LLMs to be super useful when you have a very specific question about a feature. I use Bing AI at work, so it sources all its answers and you can dive into the linked articles for more detail.
Programming is a never-ending learning journey and you just have to keep going. When you get something you don't fully understand, do a deep dive; there are always resources for everything.
Good enough to make my own things or mod things.
But not good enough to get a job as a programmer.
But not good enough to get a job as a programmer.
This is as weird of a time for getting hired as a programmer as we have ever had. Hang in there. Once we let AI deployment pipelines start causing production outages and shareholder bankruptcies, we will start falling over ourselves to hire human programmers again.