Dependency parsing has been used in Natural Language Processing (NLP) for many years, and it remains an area of ongoing research.
Together with Part of Speech (POS) tagging, it helps break up a sentence to determine 'what relates to what', defining syntactic structures of heads and dependents. Along with other techniques, it can support meaning representations of sentences; an interesting paper from Stanford is linked Here.
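To make 'heads and dependents' concrete, here is a toy, hand-annotated sketch (not real parser output; in practice a library such as spaCy or Stanford CoreNLP would produce the parse). The sentence, tags, and relation labels are illustrative assumptions only:

```python
# Illustrative sketch: a hand-annotated dependency parse for one
# sentence, showing which word is the head of which dependent.
# Token format: (index, word, POS tag, head index, relation);
# head index -1 marks the syntactic root.
SENTENCE = [
    (0, "The",       "DET",   1, "det"),
    (1, "fault",     "NOUN",  2, "nsubj"),
    (2, "cuts",      "VERB", -1, "root"),
    (3, "the",       "DET",   5, "det"),
    (4, "sandstone", "NOUN",  5, "compound"),
    (5, "layer",     "NOUN",  2, "obj"),
]

def head_dependent_pairs(tokens):
    """Return (head word, relation, dependent word) triples,
    skipping the root (which has no head)."""
    words = {i: w for i, w, _, _, _ in tokens}
    pairs = []
    for i, word, pos, head, rel in tokens:
        if head >= 0:
            pairs.append((words[head], rel, word))
    return pairs

for head, rel, dep in head_dependent_pairs(SENTENCE):
    print(f"{head} --{rel}--> {dep}")
```

The useful property for association mining is that 'fault' and 'layer' are directly linked through the verb 'cuts', regardless of how many tokens sit between them on the surface.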
During 2018 I have been applying these techniques to Geoscience text, in particular comparing the performance (accuracy) of dependency parsing against standard 'fixed word window' techniques for deriving concept and entity associations (co-occurrences).
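For readers unfamiliar with the baseline, a minimal sketch of the fixed-word-window approach follows. The window size, token list, and target terms are hypothetical; the dependency-based alternative would instead count pairs linked by parse edges rather than by linear distance:

```python
from collections import Counter

def window_cooccurrences(tokens, window=3, targets=None):
    """Count co-occurrences of terms appearing within `window`
    tokens of each other (the fixed-word-window baseline).
    Pairs are stored sorted so (a, b) and (b, a) count together."""
    counts = Counter()
    for i, left in enumerate(tokens):
        for right in tokens[i + 1 : i + 1 + window]:
            if targets is None or (left in targets and right in targets):
                counts[tuple(sorted((left, right)))] += 1
    return counts

# Hypothetical geoscience-flavoured example: which entity terms
# co-occur within a 3-token window?
tokens = "the fault cuts the sandstone layer near the anticline".split()
targets = {"fault", "sandstone", "layer", "anticline"}
print(window_cooccurrences(tokens, window=3, targets=targets))
```

Note the weakness being tested: with a window of 3, 'fault' and 'layer' never co-occur here even though they are syntactically related, while a dependency parse would connect them through the verb.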
I shall be sharing the results in the New Year!
Sounds good Paul. FYI I am running a 1 semester module on NLP in London in the New Year. This would make a great topic for a guest lecture 🙂
We’ll just have to wait!