Research Article

Parsing with Neural and Finite Automata Networks: A Graph Grammar Approach

by Sanjay Bhargava, G. N. Purohit
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 23 - Number 4
Year of Publication: 2011
Authors: Sanjay Bhargava, G. N. Purohit
10.5120/2878-3747

Sanjay Bhargava, G. N. Purohit. Parsing with Neural and Finite Automata Networks: A Graph Grammar Approach. International Journal of Computer Applications. 23, 4 (June 2011), 13-20. DOI=10.5120/2878-3747

@article{ 10.5120/2878-3747,
author = { Sanjay Bhargava, G. N. Purohit },
title = { Parsing with Neural and Finite Automata Networks: A Graph Grammar Approach },
journal = { International Journal of Computer Applications },
issue_date = { June 2011 },
volume = { 23 },
number = { 4 },
month = { June },
year = { 2011 },
issn = { 0975-8887 },
pages = { 13-20 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume23/number4/2878-3747/ },
doi = { 10.5120/2878-3747 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Sanjay Bhargava
%A G. N. Purohit
%T Parsing with Neural and Finite Automata Networks: A Graph Grammar Approach
%J International Journal of Computer Applications
%@ 0975-8887
%V 23
%N 4
%P 13-20
%D 2011
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Parsing with finite automata networks amounts, in one sense, to converting a regular expression into a minimal deterministic finite automaton, whereas parsing with neural networks concerns parsing a natural-language sentence. In parsing with finite automata networks, finite automata are frequently combined using a set of rules for operations such as union, concatenation, and Kleene closure; in parsing with neural networks, an incremental tree is usually obtained by applying a set of rules that connect a candidate parse tree to the previously obtained incremental tree. Evidently, all of the rules applied in parsing, whether with finite automata networks or with neural networks, are graph transformation rules. Such rules belong to an area of grammar known as graph grammar, that is, grammar that operates on graphs. This paper presents a twofold investigation of the use of graph grammars, exploring both of their aspects (generating a valid language and parsing a language for validity) for parsing with (i) neural networks and (ii) finite automata networks.
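The automata-combination rules named in the abstract can be viewed concretely as graph transformations on NFA fragments. The following minimal sketch (all names are illustrative, not taken from the paper) builds Thompson-style NFA graphs for union, concatenation, and Kleene closure, plus a simple epsilon-closure matcher; a subset construction and minimization step, not shown, would yield the minimal DFA the abstract mentions.

```python
# Sketch: combining finite automata as graph transformations
# (union, concatenation, Kleene closure). Helper names are hypothetical.
from itertools import count

_ids = count()  # fresh state names

class NFA:
    """An NFA fragment: edge map {(src, label): {dst, ...}}, one start, one accept.
    The empty-string label '' denotes an epsilon edge."""
    def __init__(self, start, accept, edges):
        self.start, self.accept, self.edges = start, accept, edges

def symbol(ch):
    """Base case: a two-state fragment accepting the single symbol ch."""
    s, a = next(_ids), next(_ids)
    return NFA(s, a, {(s, ch): {a}})

def _merge(*edge_maps):
    out = {}
    for em in edge_maps:
        for k, v in em.items():
            out.setdefault(k, set()).update(v)
    return out

def concat(m, n):
    """Concatenation: glue m's accept to n's start with an epsilon edge."""
    return NFA(m.start, n.accept,
               _merge(m.edges, n.edges, {(m.accept, ''): {n.start}}))

def union(m, n):
    """Union: new start branches into both fragments; both accepts rejoin."""
    s, a = next(_ids), next(_ids)
    return NFA(s, a, _merge(m.edges, n.edges,
                            {(s, ''): {m.start, n.start},
                             (m.accept, ''): {a}, (n.accept, ''): {a}}))

def star(m):
    """Kleene closure: allow zero passes (skip edge) or repeated passes (loop edge)."""
    s, a = next(_ids), next(_ids)
    return NFA(s, a, _merge(m.edges,
                            {(s, ''): {m.start, a},
                             (m.accept, ''): {m.start, a}}))

def _eps_closure(nfa, states):
    stack, seen = list(states), set(states)
    while stack:
        q = stack.pop()
        for t in nfa.edges.get((q, ''), ()):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def accepts(nfa, text):
    """Simulate the NFA directly (equivalent to running the subset construction on the fly)."""
    current = _eps_closure(nfa, {nfa.start})
    for ch in text:
        nxt = set()
        for q in current:
            nxt |= nfa.edges.get((q, ch), set())
        current = _eps_closure(nfa, nxt)
    return nfa.accept in current

# The regular expression (a|b)*c, built purely by graph transformations:
m = concat(star(union(symbol('a'), symbol('b'))), symbol('c'))
```

Each operator only adds states and epsilon edges around existing fragments, which is what makes the construction a graph-rewriting process: a fragment is replaced by a slightly larger graph whose interface (start, accept) is preserved.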

Index Terms

Computer Science
Information Sciences

Keywords

Parsing, Graph grammar, Regular expression, Natural language processing