[Image: Android humanoid (Pixabay, binary-2175285_1280)]

Using Artificial Intelligence to create predictive systems

The term Artificial Intelligence (or AI) was coined in the mid-1950s. For many years, AI technology was heavily funded by the Department of Defense. Unfortunately, practitioners at the time were overly optimistic and failed to overcome some of the difficulties they faced. By the mid-1970s, funding was largely cut in favor of more promising projects.

Artificial Intelligence in the 1980s

AI resurged in the 1980s with the commercial success of a branch called Expert Systems. But again there were issues, and AI fell back into hibernation, with a following mostly in research institutes.

More recent Artificial Intelligence

AI came back in the 1990s with special emphasis on data mining and advanced deep data analytics. In 1997, Deep Blue became the first chess computer to beat the reigning world champion, Garry Kasparov. In the 2000s, the Defense Advanced Research Projects Agency (DARPA) ran successful autonomous vehicle programs known as the Grand Challenge and, later, the Urban Challenge. The first Grand Challenge focused on navigating 131 miles of rugged terrain; no vehicle finished. The later Urban Challenge focused on navigating a protected 55-mile course while obeying traffic laws and responding to hazards, other vehicles, and even pedestrians. The Urban Challenge produced several successful finishers. These programs led to many of the collision avoidance and autonomous vehicle solutions available in our cars today.

Applying Neural Networks to Investigate Electrical Power Plant Cooling Water Discharge Temperature (contact me for the thesis) explores the use of a branch of AI called Artificial Neural Networks (ANNs) for data mining and forecasting. The paper presents the results of creating two different classes of neural networks.

The first is a data mining solution used to identify effectual variables in a large, complex, time-driven system; the second is a predictive algorithm that uses those effectual variables to forecast a future event. The goals of the program were driven by the Environmental Protection Agency's requirements to protect the water habitat and the animals (such as warm-blooded manatees) that make their home in Tampa Bay.
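To make the first class concrete: one common way a neural network can flag "effectual" variables is sensitivity analysis. Train a small network, then perturb each input in turn and see how much the prediction moves. The sketch below uses entirely synthetic data (the real plant variables are in the thesis) and a minimal NumPy network; it illustrates the general technique, not the thesis code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 5 candidate variables, but only variables
# 0 and 3 actually drive the (made-up) target temperature.
X = rng.normal(size=(500, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=500)

# One-hidden-layer network trained with plain gradient descent.
W1 = rng.normal(scale=0.1, size=(5, 8))
W2 = rng.normal(scale=0.1, size=(8, 1))

def forward(X):
    h = np.tanh(X @ W1)
    return h, h @ W2

for _ in range(3000):
    h, pred = forward(X)
    err = pred - y[:, None]                       # (500, 1)
    gW2 = h.T @ err / len(X)
    gW1 = X.T @ ((err @ W2.T) * (1 - h ** 2)) / len(X)
    W2 -= 0.2 * gW2
    W1 -= 0.2 * gW1

# Sensitivity analysis: shift one input at a time by one standard
# deviation and measure the average change in the prediction.
# A larger change means a more "effectual" variable.
base = forward(X)[1]
sensitivity = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] += 1.0
    sensitivity.append(float(np.mean(np.abs(forward(Xp)[1] - base))))

ranking = np.argsort(sensitivity)[::-1]
print("variables ranked by sensitivity:", ranking)
```

On this toy data the two planted drivers (variables 0 and 3) come out on top; on real plant data the same procedure would surface candidates for the forecasting model.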

Happy reading! By the way, I'd love to hear your feedback if you do decide to read it.


This thesis investigates neural computing applied to electrical power plant cooling water system forecasting. The target system is a coal-fired, base-load electrical power generation facility owned and operated by the Tampa Electric Company in Apollo Beach, Florida. In the process of converting the chemical energy contained in coal to heat energy and finally to electrical energy, excess heat is created and must be dissipated. That excess heat is dissipated into Tampa Bay, a large body of salt water with direct navigable access to the Gulf of Mexico.

The Environmental Protection Agency has established water temperature discharge guidelines and restrictions to protect the surrounding habitat. Current plant operations rely on unwritten heuristics to reduce the opportunity for temperature violations. The value in this work is twofold. First, the study attempts to uncover features that affect the water temperature as it leaves the Big Bend facility. Second, these features, once identified, are used to understand the characteristics of the plant as they relate to the water temperature discharge.

Neural computing techniques are used in this study because most of the variables exhibit interactions that are difficult to discern and categorize through other approaches. The pattern recognition facility of neural computing is especially important to this investigation. Intermediate neural network architectures are designed to prove out specific feature interactions, and these models contribute to the final system architecture mostly by demonstrating preprocessing requirements.
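One preprocessing requirement of this kind, time-delay handling, is easy to illustrate: a feed-forward network only sees one row at a time, so recent history has to be folded into the input vector as lagged values. A minimal sketch, using a made-up hourly temperature series rather than any real plant data:

```python
import numpy as np

# Hypothetical hourly readings of intake water temperature (deg F).
temps = np.array([72.0, 72.4, 73.1, 74.0, 74.6, 74.9, 75.3, 75.8])

def lagged_matrix(series, n_lags):
    """Each row holds [x(t-n_lags+1), ..., x(t)]; the matching
    target is the next value x(t+1)."""
    rows = [series[i:i + n_lags] for i in range(len(series) - n_lags)]
    X = np.stack(rows)
    y = series[n_lags:]
    return X, y

X, y = lagged_matrix(temps, n_lags=3)
print(X.shape, y.shape)   # (5, 3) (5,)
```

Each training example now carries a short window of history, which is how a static network can be made to capture the time-delayed effects described above.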

The resulting artificial neural network architecture and associated preprocessing accommodate time-delay features and produce results to within 1.0 degree Fahrenheit. The final architecture is a Cascade Correlation hidden-layer model, composed of 28 input nodes and 1 output node.
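For readers unfamiliar with Cascade Correlation: rather than fixing the hidden layer up front, it grows the network by adding hidden units one at a time. Each new unit is trained against the current residual error, then frozen, and receives the inputs plus all earlier hidden units as its own inputs. The sketch below illustrates that growth pattern on synthetic data with the same 28-input, 1-output shape; for brevity it draws random candidate units and keeps the best-correlated one instead of training candidates by gradient ascent, so it is a simplified stand-in, not the thesis model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the thesis setup: 28 inputs, 1 output.
n_in = 28
X = rng.normal(size=(400, n_in))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=400)

def fit_output(F, y):
    """Least-squares output weights (with bias) on feature matrix F."""
    A = np.hstack([F, np.ones((len(F), 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w, A @ w

# Cascade-style growth: start from the raw inputs, then add hidden
# units one at a time. Each candidate sees the inputs plus every
# earlier hidden unit; the one whose activation correlates best with
# the residual is frozen into the network.
F = X.copy()
for _ in range(10):
    w, pred = fit_output(F, y)
    resid = y - pred
    best_unit, best_corr = None, 0.0
    for _ in range(50):                    # random candidate pool
        v = rng.normal(scale=0.5, size=F.shape[1])
        h = np.tanh(F @ v)
        c = abs(np.corrcoef(h, resid)[0, 1])
        if c > best_corr:
            best_corr, best_unit = c, h
    F = np.hstack([F, best_unit[:, None]])  # freeze and cascade

w, pred = fit_output(F, y)
rmse = float(np.sqrt(np.mean((y - pred) ** 2)))
print(f"training RMSE after growth: {rmse:.3f}")
```

Each added unit attacks whatever error the current network still makes, which is why Cascade Correlation can discover the size of its own hidden layer instead of having it chosen by hand.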