Programming languages and where to go
I've been programming for about two years now. I find myself able to write small Java Swing applications that do something fun. Recently I wrote a class-diagram creator that reads a C++ header file and outputs the diagram as a JPG.
I'm also currently working on a two-man game project with Java and libGDX (OpenGL for Java). This is not very difficult for me.
However, I am now wondering whether I should keep developing in Java or switch to C++, or even to some of the scripting languages, especially with a view to getting a job in this industry.
I guess the root of my question is: When does switching from Java to C++ as a "main" language make sense?
It makes sense when it makes sense. Remember, programming languages are just tools. In the end, no one is going to care whether you used Java, PHP, Ruby, C++, C#, assembly, etc. They're going to care whether you're providing them what they need.
That said, the answer depends. For example, if you're never going to use C++, why switch?
Now, if you just want to learn the language, then that is a use (basically an intellectual investment), but in that case it's just a question of when you want to learn it.
July 12th, 2013, 06:43 PM
I've been a developer for a while, and I've seen many job opportunities. Frankly, I'd say Java is a better bet. C++ is very popular in some industries (especially the gaming industry), but if you're proficient in Java, you really won't find it too difficult to find a job somewhere. This is just what I've seen based on many conversations with recruiters over the years. That being said, C++ is still used very widely, so I would imagine it's still a safe bet if you enjoy it more.
Also, if you know Java, you can pick up Android mobile development as a plus.
July 12th, 2013, 07:05 PM
Getting a job isn't about quality, unless you want a wizardly job -- but in that case you're more likely to start your own company (and then get bought, thereby finding your boss) than get hired somewhere. Focus on buzzword and garbage tech -- "Worse is Better" in the broader industry, and if you're just trying to get hired you should definitely stick with Java.
Lots of companies don't know jack about software but need programmers and some manager heard "Java is l33t" 20 years ago when he wanted to be a haX0r -- and so they dictate, for completely non-technical reasons, that everything they ever do be in Java (I'm not joking -- I've actually seen this more than once). In addition to the influence of ancient market buzz on individuals, lots of companies got caught up in Java fever and they still have enormous amounts of legacy code in Java they can't part with, possibly forever. So they will always need Java programmers to maintain that stuff, and there are a ton of decent, relatively easy jobs based on this situation alone.
If you want to inflict (further) brain damage on yourself in exchange for bags of money, learning COBOL is also a good way to get a well-paying job, for much the same reason Java is -- someone heard it was a good idea, tons of stuff was written in it, and now it's entrenched across scads of deep-pocketed organizations that will forever need maintainers for their legacy code.
Once you get hired, though... start educating yourself on more interesting things (after you get a grip on whatever libraries you need for your day job!) and let your interests take over -- they may eventually guide you in a really exciting direction that is fun and pays.
July 13th, 2013, 01:47 PM
Ha, I can relate to this. I did COBOL for a while before they switched me onto Java projects. However, I'm not sure I agree with the idea that most companies just went with Java or other languages because it was the "cool" thing to do. Most larger (and even smaller) tech companies do VAST amounts of research before putting new technologies in place, making sure each is the right fit for their needs.
Originally Posted by zxq9
That being said, I have indeed seen instances of one-off technologies being used because they were really cool -- "shiny new toy" syndrome. The problem with doing that nowadays is that there is little support for them (via StackOverflow, etc.), while for technologies like Java, C#, and C++ there are mounds and mounds of information out there to overcome nearly any issue you may have. There is a lot to be said for that.
July 13th, 2013, 11:40 PM
Originally Posted by BlastPort
A primary driving force behind commercial adoption of Java was that colleges were teaching it. Colleges adopted it as part of a history of lowering compsci education standards in reaction to "you won't get a government grant for X, Y, or Z if you don't produce more IT grads" pressure during the US education panic days (still ongoing, to some degree). Quota X of students couldn't pass their electrical engineering prereqs, so those got dropped, and the same happened to the design-of-structures type courses. Without those, students couldn't pass their initial assembler, C, or compiler courses. So that whole block got replaced with Java, and OOP along with it.
Which pushed OOP into everything, because Java is a Bondage & Discipline language that forces a specific paradigm on the programmer, whether or not he has any idea what an "object" is, or of the significant differences between what the term "object" means in the literature vs. what it means in Java.
That move came as a reaction, not a thoughtful initiation. Sun was pushing its Java concept hard in industry and education at the same time -- mostly to people who were either desperate (universities starving for grant money) or didn't understand the arguments involved (hence the "portability!" cry -- for a runtime which is still less portable than C++/Qt or Python...).
This is totally true, and why I wouldn't recommend Haskell for long-duration code (but strongly recommend it for writing programs that write programs in C for you). The situation is getting better with acceptance of some more interesting technologies in major projects, but most of the really interesting stuff is internal to large, specialized development groups operating within tech companies that self-educate on a schedule instead of broadly known concepts that the average fresh graduate or web enthusiast is aware of.
July 13th, 2013, 11:58 PM
I'm not exactly sure what kind of companies you're referring to, but I've worked for a couple of very large corporations (50-100k employees), and they are incredibly careful about which technologies are allowed and disallowed; there's really no disputing that. In fact, the last corporation I worked for is, I believe, still on Windows XP/Office 2003, cautiously holding off on upgrading its employees until it's 110% safe to do so, for security's sake.
Originally Posted by zxq9
On the programming side of things, I've compared vast amounts of C code to vast amounts of Java code, all of it in "support mode" for over a decade (yes, I know both languages very well), and to claim that the maintainability of the two is even remotely comparable is a joke. Java is by far the better choice for architecting business logic and systems, regardless of the "initial reasons" companies adopted it.
C# is an alternative, as the .NET world is absolutely superior in many ways to the Java world. But taking five steps back and implying that we should all revert to languages like C for our business logic is frankly a bit frightening... lol
I don't mean to argue, but I'm just not hearing any good reasons to steer clear of the OO benefits of languages like Java and C#. Especially with them being prevalent in the mobile world (Android and Windows Mobile), they aren't going anywhere anytime soon either.
July 14th, 2013, 01:18 AM
Watch the marketing video I linked. It's quite interesting (a sort of roundtable discussion, hosted by the Computer History Museum, among the now-retired key participants about the way processor marketing went in the '80s). The size of a company is not the measure of its likelihood to make mind-blowingly huge tech mistakes.
Originally Posted by BlastPort
The runtime facilities Java provides (abstracted data types and garbage collection) make it a nicer environment in some ways, but to say that writing any code in Java makes it inherently easier to maintain than any code in C is false. Clean coding is clean coding, regardless of the language. They are two tools for very different jobs. You can't write a very effective Java runtime in Java, for example -- those are written in C and assembler for the most part. On the other hand, I wouldn't dream of writing a UI in C from scratch -- but it's not too hard in Java.
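For contrast, here is a minimal sketch of the "not too hard in Java" claim: a trivial windowed UI in a page of Swing code, which would take far more machinery from scratch in C. All class and widget names here are illustrative, not from any real project.

```java
import java.awt.GraphicsEnvironment;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;

public class HelloUi {

    // Building the panel separately keeps the layout constructible
    // (and testable) without actually opening a window.
    static JPanel buildPanel() {
        JPanel panel = new JPanel();
        panel.add(new JLabel("Hello"));
        panel.add(new JButton("Click me"));
        return panel;
    }

    public static void main(String[] args) {
        // Top-level windows can't be created without a display.
        if (GraphicsEnvironment.isHeadless()) return;

        // Swing components should only be touched on the event dispatch thread.
        SwingUtilities.invokeLater(new Runnable() {
            public void run() {
                JFrame frame = new JFrame("Demo");
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setContentPane(buildPanel());
                frame.pack();
                frame.setVisible(true);
            }
        });
    }
}
```

The whole UI lives on the runtime's event dispatch thread; in C you'd be hand-rolling (or binding to) an event loop, widget toolkit, and layout engine before a single button appeared.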
And by far the worse choice when weighed against a properly designed relational system (where the rules are naturally emergent from the inherent structure and rules of the data itself). A proper data model coupled with a functionally designed service which can render complete answers to any connecting system is ideal.
But how many business systems have you seen where a database is thought of (or even referred to outright) as a "persistence layer," and objects have been created ad nauseam to represent different concepts, each referring to an ever-growing stock of other objects that represent rules? This is a very common case, and in every case it is an attempt to re-solve the relational data problem without ever recognizing the problem. It occurs either because OOP was the only tool in the initial architect's toolbox at the time the system was designed, or because the original designer knew it would be too difficult to find future maintainers who understood how things work -- so a baroque OOP-only design is better for being worse, because it meets the lowest common denominator across the industry.
This is not at all what I said. I said the education standards have dropped precipitously -- first abandoning electrical engineering and design and then the study of architecture, compilers and C (for which students lacking the first two are especially unprepared) with it.
The fallout from that is bad. When students learn Java and only Java, they also learn OOP and only OOP. This is horrible because it winds up being a one-paradigm-everywhere situation -- which is stupid. In the end, in projects where OOP doesn't fit, a huge amount of code is spent implementing functional or declarative facilities in a program library to make up for the gaps inherent in the design of Java itself. It's not a bad language, but it is a bad only language -- yet a large swathe of the commercial development world treats it not just as a programmer's only needed language but as the only language in the world, which is ridiculous.
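To make that "filling the gaps" point concrete, here is a sketch of the ceremony Java (as of this thread's Java 7 era, pre-lambdas) demands to fake a single higher-order function. The `Fn` interface and all names are illustrative, not from any library.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class FakeFunctional {

    // A one-method interface standing in for a first-class function.
    interface Fn<A, B> { B apply(A a); }

    // A hand-rolled "map": apply f to every element of xs.
    static <A, B> List<B> map(List<A> xs, Fn<A, B> f) {
        List<B> out = new ArrayList<B>();
        for (A x : xs) out.add(f.apply(x));
        return out;
    }

    public static void main(String[] args) {
        // An anonymous inner class is the boilerplate required for what
        // is a one-liner in any functional language.
        List<Integer> doubled = map(Arrays.asList(1, 2, 3),
            new Fn<Integer, Integer>() {
                public Integer apply(Integer n) { return n * 2; }
            });
        System.out.println(doubled); // prints [2, 4, 6]
    }
}
```

Every project that wants this style ends up growing its own private `Fn`/`map`/`filter` library, which is exactly the gap-filling described above.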
There is nothing wrong with OOP when it is used where it makes sense -- simulation -- and preferably a brand of OOP where message-passing and live polymorphism (as opposed to strict inheritance-based polymorphism) is a native part of the system. Neither of those is quite true in the case of Java. When I say "simulation," keep in mind that business rule systems can be very effectively written this way (in most cases) and, in particular, UI designs of nearly any type are quite easy to comprehend when designed as simulations (whether or not the programmer realizes this is what widget libraries typically do is not the issue).
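The UI-as-simulation idea above can be sketched in a few lines: independent widgets each react to a broadcast message in their own way, and the dispatcher never switches on type. This is interface-based polymorphism (Java's approximation of message-passing, not the real thing), and all names are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

public class WidgetSim {

    // Every widget understands "receive a message" and reacts its own way.
    interface Widget { String receive(String message); }

    static class Button implements Widget {
        public String receive(String message) {
            return message.equals("click") ? "button fired"
                                           : "button ignored " + message;
        }
    }

    static class TextBox implements Widget {
        private final StringBuilder text = new StringBuilder();
        public String receive(String message) {
            text.append(message);
            return "textbox now holds: " + text;
        }
    }

    // The "simulation loop": send one message to every widget and collect
    // the reactions. The caller never inspects concrete widget types.
    static List<String> broadcast(List<Widget> widgets, String message) {
        List<String> log = new ArrayList<String>();
        for (Widget w : widgets) log.add(w.receive(message));
        return log;
    }

    public static void main(String[] args) {
        List<Widget> ui = new ArrayList<Widget>();
        ui.add(new Button());
        ui.add(new TextBox());
        for (String line : broadcast(ui, "click")) System.out.println(line);
    }
}
```

This is essentially what widget toolkits do under the hood with event objects, whether or not the application programmer thinks of it as a simulation.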
But back to the OP's point. He was asking what he should learn. If the goal is to get hired, then things like Java and COBOL are definitely at the top of the list. But if the goal is to understand how we can make computers do our work for us, then there is a longer, more interesting list (with an accordingly bumpier path of progression) of things to which he may wish to turn his attention.
[EDIT]: You, and anyone else interested in this thread, may be interested in the history of [Worse is Better] as told by Richard Gabriel himself. The arguments there are between Lisp and "the MIT approach" (AKA "The Right Thing") and the C/Unix approach (AKA "New Jersey design"). The difference between the two is interesting, as are the parallels between them and the current competing technologies. It is good/bad and encouraging/sad that we've decided to dive for webtech, Java and Windows as a part of this progression -- but it is exclusively sad that very few of today's students and coders are aware of the issues discussed in Worse is Better, or even that a discussion about this exists.
Last edited by zxq9; July 14th, 2013 at 03:38 AM.