\section{Discussion}\label{sec:Conclusion}
We can now calculate the focus time of an event query with the help of the semantic associations among words learned by Word2Vec. The prediction accuracy on event queries is better than that of the state-of-the-art benchmarks, and the training time is much lower than that of previous methods. We also find terms that are not present in the event description but are semantically similar to the event. The dataset covers publications from 1991 to 2010; with the help of pseudo-relevance feedback, queries referring to years outside this coverage also perform better than the current state-of-the-art baselines.

The reason is that graph-based methods rely blindly on co-occurrence counts and other purely statistical measures to relate a temporal unit to a word in the corpus. Word2Vec, by contrast, trains a shallow neural network to predict words from their surrounding context, learning distributed representations of words; at the end of the vectorization process, contextually similar terms therefore lie close together in the semantic space. So, even when an event that happened in a year outside the coverage is only referenced indirectly, the terms that occur in the context of a particular query remain interrelated in the semantic space (even if they are not statistically correlated in terms of co-occurrence frequency). Combining these contextually similar terms with the year vectors thus yields a better representation of an event, from which the correct focus time of a query can be concluded.
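The procedure described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the vocabulary, the \texttt{y\_\,} prefix convention for year tokens, and the random stand-in vectors are all assumptions; in the real system the vectors would come from Word2Vec trained on the 1991--2010 corpus, and the query would be expanded against the full vocabulary.

```python
# Hypothetical sketch: estimating the focus time of an event query by
# comparing term vectors with year vectors in a shared embedding space.
# The random vectors below stand in for embeddings that Word2Vec would
# produce on the real corpus; the tiny vocabulary is illustrative only.
import numpy as np

rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=50) for w in
         ["earthquake", "tsunami", "relief", "aid",
          "y_1991", "y_2004", "y_2005"]}  # "y_*" tokens are year vectors

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand_query(terms, k=2):
    """Pseudo-relevance-style expansion: add the k vocabulary terms
    closest to the query centroid (excluding year tokens and the
    original terms)."""
    centroid = np.mean([vocab[t] for t in terms], axis=0)
    candidates = [w for w in vocab
                  if not w.startswith("y_") and w not in terms]
    candidates.sort(key=lambda w: cosine(vocab[w], centroid), reverse=True)
    return terms + candidates[:k]

def focus_time(terms):
    """Rank year vectors by their average cosine similarity to the
    expanded query and return the best-scoring year token."""
    expanded = expand_query(terms)
    years = [w for w in vocab if w.startswith("y_")]
    scores = {y: np.mean([cosine(vocab[t], vocab[y]) for t in expanded])
              for y in years}
    return max(scores, key=scores.get)

print(focus_time(["earthquake", "tsunami"]))
```

Because the expansion step works in the embedding space rather than on co-occurrence counts, terms that never co-occur with a year inside the corpus coverage can still pull the query toward the right temporal unit, which is the behaviour the paragraph above attributes to the out-of-coverage queries.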
We can now calculate focus time of an event-query with the help of semantic space association among words concluded from Word2Vec. The prediction accuracy of event-queries is better than the state of art benchmarks. The training time is much lesser than the previous methods. We also find terms that not present in event description but are semantically similar to the event. Now the dataset contains coverage from the publishing year 1991 to the year 2010. With the help of pseudo-relevance, the queries outside the coverage are also working better than the current state of art baselines. The reason being, in graph-based methods, co-occurrence and other pure statistical measures are calculated blindly in order to find out the relation between a temporal unit and a word present in the corpus. In recurrent neural networks, used in Word2Vec architecture, an internal memory is used to process arbitrary sequences of long inputs. They were introduced to learn distributed representations of structure, such as logical terms. That's why, at the end of vectorization process, we get contextually similar terms. So, even if there are only references of past events happened in years outside coverage, the terms which have occurred in the context of a particular query are interrelated in semantic space (may not be statistically correlated, in terms of co occurrence frequency). So combining the contextually similar terms with year vectors results in better representations of an event, which concludes the correct focus time of a query.