Title: Learning and Reasoning with Semantic Vectors

Abstract: Learning distributional similarities from text is well-established, and has been used in many tasks including information retrieval, word sense disambiguation, lexical acquisition, and translation. Different mathematical techniques are available, including singular value decomposition, latent Dirichlet allocation, and random projection, with different semantic and computational properties that make them appropriate for different tasks. However, for many years these methods could be used only to detect symmetric "similarity" relationships: because they lack syntactic or directed relationships, they have often been described as "bag-of-words" methods.

In the last few years, research has taken a significant step further, learning relationships beyond similarity. Linguistically, this is achieved by taking word order, or deeper syntactic and semantic relationships, into account. Mathematically, it is implemented using more sophisticated operators, such as permutations or convolutions of coordinates, to represent language composition.

In this talk, we will briefly survey this area and discuss a specific application of semantic vectors to inference in the medical domain. We use semantic vectors to learn representations for different objects and their relationships, and test this model by applying it to an analogical reasoning test of the form "A is to X, as B is to what?" Results to date suggest these methods are both robust and computationally efficient.

Reference:
Cohen, T., Widdows, D., Schvaneveldt, R. W., and Rindflesch, T. Finding Schizophrenia's Prozac: Emergent Relational Similarity in Predication Space. Proceedings of the Fifth International Symposium on Quantum Interaction, Aberdeen, UK, 2011. http://www.puttypeg.net/papers/schizophrenias-prozac.pdf
The combination of learning and reasoning within the same model is a challenge that applies to many disciplines, and is perhaps a key ingredient in building intelligent technology.
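The binding operators and the analogical test mentioned in the abstract can be illustrated with a small sketch. This is not the system from the cited paper: it is a minimal holographic-reduced-representation-style toy, assuming random high-dimensional vectors, circular convolution as the binding operator, circular correlation as its approximate inverse, and an invented `treats` relation over a handful of illustrative drug/disease names.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1024  # high dimensionality keeps random vectors nearly orthogonal

def rand_vec():
    # random unit vector, entries ~ N(0, 1/d)
    v = rng.normal(0.0, 1.0 / np.sqrt(d), d)
    return v / np.linalg.norm(v)

def bind(a, b):
    # circular convolution: composes two vectors into one of the same dimension
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

def unbind(c, a):
    # circular correlation: approximate inverse of bind
    return np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))).real

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# toy concept vectors (names are illustrative, not learned from any corpus)
concept = {w: rand_vec() for w in
           ["depression", "schizophrenia", "diabetes"]}
treats = rand_vec()  # hypothetical relation vector

# each drug's vector encodes the disease it treats, via binding
drug = {
    "prozac":      bind(treats, concept["depression"]),
    "haloperidol": bind(treats, concept["schizophrenia"]),
    "insulin":     bind(treats, concept["diabetes"]),
}

# analogy: "depression is to prozac, as schizophrenia is to what?"
relation = unbind(drug["prozac"], concept["depression"])  # noisy estimate of treats
query = bind(relation, concept["schizophrenia"])
answer = max(drug, key=lambda w: cosine(query, drug[w]))
print(answer)  # with high-dimensional vectors, this recovers "haloperidol"
```

The unbinding step returns only a noisy approximation of the relation vector, so the nearest-neighbour search over the vocabulary at the end is essential: it is the noise tolerance of high-dimensional random vectors that makes this style of compositional reasoning robust.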