In 1828, Webster's Dictionary defined calculator/computer as:
CALCULATOR, noun One who computes or reckons; one who estimates or considers the force and effect of causes, with a view to form a correct estimate of the effects.

COMPUTER, noun One who computes; a reckoner; a calculator.
Today, of course, it defines a computer as a programmable electronic device: a machine, not a person.
I can't help but ponder the term "AI Programmer."
If I search for "AI Programmer" today, the results are programmers who specialize in artificial intelligence (AI). But as we discuss the rise of AI bots taking people's jobs, it seems obvious that the jobs they would take first are those of application programmers, not doctors or lawyers.
IBM pushes the idea that Watson went to medical school by ingesting all known information about medicine. Yet, they apparently didn't send Watson to school for computer science. Why not?
Could I hire an AI Programmer? Not a person who programs computers to do AI, but an AI that writes application programs. No, I cannot; I looked, and none are available for hire or purchase. I can't hire an AI Data Analyst either - at least not an AI bot that analyzes whatever I ask it to analyze.
Before Y2K, application programming was supposedly dying because, in the early 90s, we were all going to become business analysts using CASE tools and 4GL languages/tools. Yet that vision died with the rise of languages based on C, which was a systems programming language at the time, not an applications programming language. So now we have an enormous amount of code maintained in languages like C++, Java, and C#. We got more techie, not less techie, over the last three decades.
In fact, writing applications has become more difficult over the last 30 years, not less. That is the primary reason organizations employ more application developers today than they did 30 years ago, despite the post-Y2K prediction by most IT pundits that application programming was dead. After all, the thinking went, everyone was moving off of custom applications that conferred competitive advantage and onto the same packaged software as everyone else - so why keep application programmers? For the same reason organizations built custom enterprise software before: to gain a competitive advantage.
Why did Watson study Medicine and not Computer Science?
So now, do I believe application programming itself will survive, but the job will go to an AI program that follows my directions and cuts out the human application programmer as a middleman? Well, I keep coming back to why IBM sent Watson to school for medicine instead of computer science. Did they just want a doctor in the family?

I'm thinking that AI just isn't there yet. What people are calling artificial intelligence and machine learning (ML) is little more than the same old predictive analytics and decision trees we have had at our fingertips for decades. The tooling wrapped around them mostly compensates for the fact that we made everything harder when we moved off CASE tools, 4GLs, and even business-oriented 3GL languages, and started writing application programs with systems programming syntax.
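To make that claim concrete, here is a minimal sketch - assuming Python with scikit-learn, and entirely made-up patient features - of the kind of decision tree that sits underneath many products marketed as AI:

```python
# A toy decision tree: the technique behind much of what gets labeled "ML".
# The features, data, and threshold values here are hypothetical.
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical training data: [age, systolic_bp], label = needs follow-up?
X = [[34, 118], [61, 150], [45, 132], [70, 165], [29, 110], [55, 140]]
y = [0, 1, 0, 1, 0, 1]

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

# Print the learned rules - the same if/else logic analysts have
# hand-built in statistics packages for decades.
print(export_text(tree, feature_names=["age", "systolic_bp"]))

# Score a new, hypothetical patient record.
print(tree.predict([[50, 145]]))  # e.g., [1] -> flag for follow-up
```

A few lines like these have been available in analytics packages for decades; the new tooling mostly changes the packaging, not the technique.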
Watson is no JARVIS
The only way one can interpret this is that computer science is harder than medicine. Watson went to medical school because all he has to do is guess, based on symptoms, what a diagnosis is likely to be. That probably has some value for real physicians in considering rare diseases - but it isn't putting anyone out of a job.
When it comes to jobs, why is Tony the only one with a JARVIS? If everyone has a JARVIS, and JARVIS has robots, what do people do? I look forward to a day when I can work alone in my mansion like Tony Stark, tell JARVIS what I want, and, once I agree it looks like he did it, say "deploy it" and have it doing my bidding. The reality, however, may look more like "Ready Player One," where everyone is poor except a very few, and the masses just play games and collect welfare because AI and robots do all the work.
Competitive Advantage from Technology
Until JARVIS comes along, my team and I are building the next generation of healthcare solutions that enable providers and payers by automating the hard work of healthcare data analytics. But it isn't AI or ML. It is just good analytics practice coupled with standard interfaces and an easy-to-use user interface, easing a labor-intensive job - data acquisition and shaping in healthcare - that sits outside the core competency of the user (a healthcare worker). That is how we minimize the application (analytics) code written inside healthcare organizations and rapidly produce the information human managers need to improve outcomes. Both of those reduce healthcare cost and improve outcomes for the price.

My role in life may be to replace IT/analytics labor at customer sites with apps developed by systems programmers at my company, but that is how technology has always created value. We replace what can be replaced with technology so that those resources can find new opportunities to create new processes. We call it retiring technical debt: replacing what had to be done as a one-off with a common appliance. But no AI has yet identified such opportunities and solved for them the way application programmers have for decades.
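For a rough sense of what that acquisition-and-shaping work looks like in code - assuming Python with pandas, and with a file name and column names that are purely illustrative, not our product's actual interface - consider:

```python
# A hypothetical sketch of "data acquisition & shaping": pull a raw claims
# extract, then roll it up into the summary a manager actually reads.
import pandas as pd

# Acquire: in practice this might arrive over a standard interface such as
# an HL7/FHIR feed; a flat file stands in for it here (hypothetical name).
claims = pd.read_csv("claims_extract.csv")

# Shape: normalize types, then aggregate cost and readmissions by provider.
claims["paid_amount"] = claims["paid_amount"].astype(float)
summary = (
    claims.groupby("provider_id")
          .agg(total_paid=("paid_amount", "sum"),
               readmit_rate=("readmitted", "mean"),
               encounters=("claim_id", "count"))
          .sort_values("total_paid", ascending=False)
)
print(summary.head())
```

The point is that the shaping step is mechanical and repeatable - exactly the kind of one-off labor an application can retire as technical debt.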