Last October I started studying Applied Computer Science in Berlin. I already had a lot of experience, since I had taught myself to code and had been working since July as a software developer at a larger company.
I thought I could still learn some things at my university, but that wasn't the case. The few times I attended my programming lessons, I didn't learn anything; instead, I wanted to hit my head on the table. As a more experienced student, I knew that a lot of what we were told was old-fashioned or simply wrong.
If you have also started studying computer science and you love developing software, here are some tips on how you can easily improve your skills:
I'm not saying that attending your courses makes no sense, especially if you have never written code before. But don't do only that. To become a really good programmer, you have to put in a lot of work. It's an ongoing, never-ending process. And that is what got me so interested in writing code: you never stop learning.
Personally, I cannot share that experience.
Firstly:
Students should always learn on their own! Universities cannot teach you all the bleeding-edge stuff, due to the pace of change. But they can teach the old, rock-solid core concepts - which are very, very important.
One shouldn't forget: the purpose of the first semesters is to build a common foundation for all students. Already experienced people may find those lectures boring or poorly taught. But that isn't always the case.
A US college examined the impact of prior experience on grades in basic programming courses. Often, those already experienced people don't get the best grades, because they are no longer open to learning the basic concepts again and stick to their existing knowledge without questioning or sharpening it!
Since I was in your situation, I attended the lectures, helped my teachers with good questions and explanations, helped others when they were stuck on assignments, and so on. That way I sharpened and clarified my own knowledge while helping others learn.
Secondly:
I saw many fellow students complaining about stuff they would 'never use in the real world'. They always said they would simply forget it, since it had nothing to do with the job of a programmer. Take 'Theoretical Computer Science', 'IT Software Project Management' or 'Core Competence Communication', for instance.
Today we often talk about how that 'boring' and 'impractical' stuff has changed our way of thinking and problem-solving. We didn't even notice it until now!
I think what I want to say is:
Don't lose curiosity and attention, even in boring lectures! Make them better with the knowledge you have.
And after two semesters, the interesting stuff is still ahead ;-)
Enjoy it!
Wow, you made me look at things from a new perspective. I have been to three different universities, simply because my family moves a lot, so I had taken my first few programming classes so many times that by my third university I found them completely boring and felt like the professors weren't doing a good job of teaching. (I had to retake my major classes each time.) But this is indeed an eye-opener. Thank you!
I am currently studying at the University of Applied Sciences Salzburg and can't share your experience.
We learned a lot of different technologies and concepts and even had an internship during our studies. While working on real products, I received only positive feedback, and I could even bring in some new ideas on how to improve the existing codebase.
Sorry to hear that you got frustrated at your university. Maybe you can give them some input on how to improve their courses.
I learned to program in three main ways:
Well, if you really want to learn algorithms, how to build AI applications and Machine Learning, Statistics, and so on... go to college!
If you just want to find a job as an enterprise developer and create apps, learn SQL and other languages such as Java, C#, Swift or JavaScript (+ HTML5, CSS) and go for it!
You can find different ways to learn, but if you really want to understand computation and are looking for deep knowledge of how a computer works and what it can do - such as robotics, electronics, IoT, Virtual Reality, DNA computing, networking, NLP, parallel computing, games, and so on - go to college, to understand what Computer "Science" is really about!
I think it depends on how involved the professors are... Many of them only care about their research and not about what students must learn by the time they graduate.
At my college the situation is no different from what you described, but I came to realize what you said, and I wish I had realized it earlier. Later I got the opportunity to become a professor at my college, so I told my students the same words, and maybe more.
You should really provide examples of what was "wrong" and explain why you think so. Just making that claim without giving any examples doesn't help anyone.