I'm about to blow your mind, because you've probably never heard someone say this in all of your indoctrination, er, ahem, I mean training. Object Oriented Programming is wrong.
The fundamental tenets of Object Oriented design methodology are fallacious. The promise of OOP is that anthropomorphizing data will make it easier for programmers to grasp the data interrelations and interactions, but it just muddies the waters of how that data is processed. The class does not exist. It really is just a bag of data--your CDog instance is not a dog, it's electrical signals that represent numbers that represent data that represent a dog--and to treat it otherwise is to peddle lies.
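Strip away the ceremony and this is easy to demonstrate. Here's a minimal sketch (the field names are my own invention, since CDog above is just a stand-in):

```python
# A "dog" object is nothing but a record of values.
class CDog:
    def __init__(self, name, breed, weight_kg):
        self.name = name
        self.breed = breed
        self.weight_kg = weight_kg

rex = CDog("Rex", "beagle", 12.5)

# The same values survive perfectly well as a plain dictionary,
# and the instance's own attribute table admits as much.
rex_as_data = {"name": "Rex", "breed": "beagle", "weight_kg": 12.5}
print(vars(rex) == rex_as_data)  # True
```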
OOP has done nothing to decrease the complexity of application development, nor has it led to more successful projects (success measured in terms of coming in on budget). And if you look at the OOP advocated by such ecosystems as Java and Struts, you'll even see a massive increase in complexity.
In any enterprise scenario, the most important thing is data. An enterprise application is nothing without a solid database, and the database will live on long past the application itself. That's the fallacy of N-Tier and MVC architectures: that the database is somehow an integral part of the application. How many times have you worked on new intranet applications that were connecting to and manipulating old databases, be they ancient mainframe systems or over-leveraged MS Access debacles? We'll be writing programs in C# 7.0 before we completely migrate off of dBase II. That's why COBOL applications still exist: the data storage was integrated into the programming language. And in an ever more regulated business environment, our requirements for data are only going to increase exponentially.
But what changes constantly and dramatically is how we want to see that data. That is our application. The means by which we manipulate data are absolutely decoupled from the data itself. In 20 years' time, you're not going to want to know how report XYZ was calculated (in some part because it was probably wrong), mostly because your business needs are completely different and that report just doesn't make sense anymore.
By using OOP, you're marrying your data to your behaviors, by its very definition, and then trying to hide your data. It's like playing 20 questions with your business. When some new business analyst comes along, trying to make a name for himself in the company, and declares that System XYZ, which has been running perfectly well for the last decade, needs to be rewritten "to bring it up to industry best practices", in an OO scenario you're in the old cliché of "throwing out the baby with the bathwater." This is especially true if you've used an ORM system that lets the application drive the database design.
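Here's what that game of 20 questions looks like in practice, in a hedged sketch (the Account class and its fields are invented purely for illustration):

```python
# Data hidden behind behavior: to get anything out, you must
# ask the object one question at a time.
class Account:
    def __init__(self, balance, owner):
        self._balance = balance  # "private" by convention
        self._owner = owner

    def get_balance(self):
        return self._balance

    def get_owner(self):
        return self._owner

# Data as data: the whole record is inspectable, serializable,
# and reportable without interrogating an object.
account = {"balance": 1000.00, "owner": "A. Smith"}
```

When the rewrite-happy analyst shows up, the dictionary walks straight into the new system; the class goes out with the bathwater.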
This is probably why there aren't very many jobs for Lisp and other functional-language programmers: their languages were pretty much right the first time, so there's no pressure to "upgrade," and their applications just work the first time, because their programmers know the difference between data and how we manipulate the data. In functional programming languages, EVERYTHING is understood to be just a bag of data. In Lisp and Scheme, you pass lists around to all of your functions, lists that don't advertise particularly well what is in them. That's because, for the most part, it doesn't matter. You don't want your application to know very much about your data. You want to segregate the parts of the application that do know about the data into very restricted and limited areas. The more generic the bulk of the code, the more reusable it is.
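A small sketch of that generic style (the order rows and the total helper are my own invention): the bulk of the code knows only that it has a list of rows and a way to pull a value out of each one.

```python
# Generic core: knows nothing about orders, dogs, or reports,
# just that it has rows and a function to extract a value from each.
def total(rows, pick):
    return sum(pick(row) for row in rows)

# The data-aware part is confined to one small, replaceable edge.
orders = [("widget", 3, 10.00), ("gadget", 1, 24.50)]
print(total(orders, lambda row: row[1] * row[2]))  # 54.5
```

Swap in invoices, payroll lines, or vet records and total never changes; only the little lambda at the edge knows what the data means.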
Which brings me to yet another fallacy of OOP: code reuse. When was the last time you reused a class you wrote yourself? I bet you have a "Utility" class somewhere that's a bag of static methods; we all do, because OOP doesn't account for the fact that functionality exists separate from data. That doesn't count; that's not a real class. I'm talking about BusinessEntityWidgetX implementing GodAwfulInterfaceY. When was the last time you reused either across applications?
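If you doubt it, look at what that "Utility" class actually is. A quick sketch (StringUtils and slugify are invented for illustration): the class contributes nothing but a namespace, because the functionality never belonged to any data in the first place.

```python
# The familiar pattern: a class that exists only to hold functions.
class StringUtils:
    @staticmethod
    def slugify(text):
        return text.strip().lower().replace(" ", "-")

# The honest version: the function stands alone just fine.
def slugify(text):
    return text.strip().lower().replace(" ", "-")

print(StringUtils.slugify("Hello World"))  # hello-world
print(slugify("Hello World"))              # hello-world
```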
So, OOP, you promised the world, and you never delivered. Then you built a system of indoctrination around yourself (university computer science programs) to protect yourself from criticism. Congratulations, you've achieved the status of religion.