Thread: Programming AI
  #8  
Old 01-17-2009, 04:25 PM
Kobaz
Hill Giant
 
Join Date: Nov 2008
Location: Gold Coast, Oz
Posts: 119

I worked in a faculty where there was a lot of AI research. My area of expertise is using GACA (genetic algorithm cellular automata) for modelling spatially heterogeneous dynamic systems. My colleagues worked on other problems, including chat-bot interfaces to expert systems. We often had complicated conversations over coffee while things compiled or ran, bouncing our show-stoppers off each other. So take what I say with a large grain of salt, as I'm recalling things that came up in brainstorms.

The short-term/long-term memory issue can be handled with techniques similar to those used in ARIMA time-series analysis to monitor serial correlation; the maths is a lot like what's used to try to predict what the stock market is doing. The idea is that every concept has a weight factor, and more recent concepts carry higher weights than older ones. If a concept is raised that is highly correlated with an older one, the older concept's weight gets boosted by a small amount. Over the course of a conversation, the topic under discussion tends to keep enough weight to stay at the head of a queue of topics. As in all evolutionary computing (ANN, GA, GACA, GP), getting the rate of change of the weights right is damn hard.
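
Roughly, the decay-and-boost bookkeeping looks something like this. This is my own toy sketch, not code from any real system: the constants and the similarity() callable are placeholders, and tuning the real rates is exactly the hard part I just mentioned.

[CODE]
# Toy sketch of recency-weighted topic memory. DECAY, BOOST, the
# threshold, and similarity() are all made-up placeholders.

DECAY = 0.95          # per-turn ageing of every concept's weight
BOOST = 0.15          # small bump for older concepts related to new input
SIM_THRESHOLD = 0.7   # how correlated an old concept must be to get boosted

class TopicMemory:
    def __init__(self, similarity):
        # similarity: hypothetical callable scoring two concepts in [0, 1]
        self.similarity = similarity
        self.weights = {}   # concept -> current weight

    def observe(self, concept):
        # Age everything first, so recency dominates by default.
        for c in self.weights:
            self.weights[c] *= DECAY
        # Boost older concepts that correlate with the new one.
        for c, w in list(self.weights.items()):
            if c != concept and self.similarity(c, concept) >= SIM_THRESHOLD:
                self.weights[c] = min(1.0, w + BOOST)
        # The new concept enters (or re-enters) at full weight.
        self.weights[concept] = 1.0

    def current_topic(self):
        # Head of the topic queue: the heaviest concept.
        return max(self.weights, key=self.weights.get, default=None)
[/CODE]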

I know one fellow was constructing ontologies of the conversations and using them to maintain context. He used multiply threaded trees, kept in a DB using memory-only tables (for speed) and periodically flushed to disk. As I recall, the DB was B-tree based.
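
I can only guess at his actual schema, but the shape of the thing would be something like this: one memory-only table for the tree proper, another for the cross-links (the "threads"), and a periodic flush to disk. Here's a rough sketch using SQLite's in-memory mode (whose indexes happen to be B-trees); all the table and column names are mine, not his.

[CODE]
# Rough guess at a "multiply threaded tree" in a memory-only DB,
# flushed to disk periodically. Schema and names are illustrative only.
import sqlite3

mem = sqlite3.connect(":memory:")
mem.executescript("""
    CREATE TABLE node (
        id     INTEGER PRIMARY KEY,
        parent INTEGER REFERENCES node(id),   -- the primary tree structure
        label  TEXT NOT NULL
    );
    CREATE TABLE thread (                     -- cross-links between branches
        src  INTEGER REFERENCES node(id),
        dst  INTEGER REFERENCES node(id),
        kind TEXT NOT NULL                    -- e.g. 'synonym', 'part-of'
    );
    CREATE INDEX thread_src ON thread(src);
""")

def flush_to_disk(path="ontology.db"):
    # Periodic checkpoint: copy the whole in-memory DB to a file
    # (Connection.backup is available in Python 3.7+).
    disk = sqlite3.connect(path)
    mem.backup(disk)
    disk.close()
[/CODE]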