Ottawa, June 7 - 11, 2021
Ex: what is the population of oregon? → answer(A, (population(B, A), const(B, stateid(oregon))))
Currently, neural-network models called Transformers (the architecture behind Google Translate) give state-of-the-art results on all of the benchmark semantic parsing datasets.
In this poster presentation, we propose our own Transformer-Based Semantic Parser (TBSP), which uses a two-layered approach, each layer being a Transformer.
At the first layer, our method produces a rough sketch of the parse (a 'coarse' parse): an ordered list of all the logical operators, predicates, relations, and constants present in the semantic parse of the input natural language sentence.
The second layer accepts this rough sketch as input and outputs the final semantic parse (a 'fine' parse).
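To make the coarse-to-fine idea concrete, the sketch below derives a plausible coarse parse from the Geoquery-style logical form in the example above. The exact sketch format used by TBSP is not specified here, so this extraction rule (keep the lowercase operator, predicate, and constant symbols in order; drop uppercase variables and punctuation) is an illustrative assumption, not the paper's definitive method.

```python
import re

def coarse_sketch(fine_parse: str) -> str:
    """Extract a hypothetical 'coarse' sketch from a Geoquery-style
    logical form: the ordered sequence of operator, predicate, and
    constant symbols, with argument variables and punctuation removed.
    (Illustrative assumption; the actual TBSP sketch may differ.)"""
    # Geoquery symbols (answer, population, const, stateid, oregon, ...)
    # are lowercase, while variables (A, B, ...) are uppercase, so a
    # lowercase-only match keeps symbols and drops variables.
    tokens = re.findall(r"[a-z_]+", fine_parse)
    return " ".join(tokens)

fine = "answer ( A , ( population ( B , A ) , const ( B , stateid ( oregon ) ) ) )"
print(coarse_sketch(fine))  # -> answer population const stateid oregon
```

The second-layer Transformer would then condition on this symbol sequence when generating the fully bracketed 'fine' parse with variables restored.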
We will report on our model's performance on the Geo880 dataset and its accuracy improvement over a one-layered TBSP.