Researchers from Egypt to Cambridge are working on “affective programming” for computers: essentially, teaching machines to interpret human emotions by sensing changes in facial expression. Computers have historically struggled to differentiate between subtle changes, so researchers are now feeding them huge volumes of data to build a more accurate understanding. The work has promising goals, from helping people with communication disabilities to GPS systems that know when they are being annoying. Just as with humans, though, misinterpretations are still liable to happen. And not everyone thinks that teaching computers to understand us is a good idea. Read the full article here: “But How Do You Really Feel? Someday the Computer May Know.”