
This is the first in-depth exploration of the notion of time in natural language database interfaces. It will be particularly interesting to researchers working on natural language interaction, tense and aspect, HPSG, temporal logics, and temporal databases, especially those who wish to learn about time-related issues in other disciplines. The framework addresses difficult problems involved in making natural language queries to temporal databases possible, and together with its prototype interface, it constitutes a foundation from which more complex interfaces may be developed and the value of representing event structure in temporal databases may be explored.

However, the discussion in the book pushes the state of the art in NLIDBs towards more empirical and less exploratory work. Moreover, the book contains clear place-holders for research on issues such as evaluation and cooperative response generation. In that respect, the book provides a necessary milestone on a worthy path.

Pablo Ariel Duboue, Linguist List

For example, executing query (2) will pull out the value 'greece': it specifies a result set consisting of all values for the column Country in data rows where the value of the City column is 'athens'. How can we get the same effect using English as our input to the query system? The feature-based grammar formalism described in Chapter 9 makes this possible. The grammar sql0.fcfg illustrates how to assemble a meaning representation for a sentence in tandem with parsing it: each phrase structure rule is supplemented with a recipe for constructing a value for the feature sem.
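A sketch of this parse-and-translate step, using the sql0.fcfg grammar bundled with NLTK (the example question is an illustrative one that this small grammar covers):

    from nltk import load_parser

    cp = load_parser('grammars/book_grammars/sql0.fcfg')   # pass trace=3 to watch sem being assembled
    query = 'What cities are located in China'
    trees = list(cp.parse(query.split()))
    answer = trees[0].label()['SEM']       # the sem feature holds SQL fragments
    q = ' '.join(s for s in answer if s)   # drop empty fragments and join the rest
    print(q)                               # SELECT City FROM city_table WHERE Country="china"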

Your Turn: Run the parser with maximum tracing on, i.e., with the trace level set to its highest value, and examine how the values of sem are built up as the parse proceeds. Finally, we execute the query over the database city.db. Since each row r is a one-element tuple, we print out the member of the tuple rather than the tuple itself. To summarize, we have defined a task where the computer returns useful data in response to a natural language query, and we implemented this by translating a small subset of English into SQL.
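A sketch of this final step, running the generated query q (from the sketch above) over the city.db database that ships with NLTK:

    from nltk.sem import chat80

    rows = chat80.sql_query('corpora/city_database/city.db', q)
    for r in rows:               # each row is a one-element tuple
        print(r[0], end=" ")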

This parallels being able to translate from Dutch into English as an example of natural language understanding. Suppose that you are a native speaker of English, and have started to learn Dutch. Your teacher asks if you understand what (3) means:

(3) Margrietje houdt van Brunoke.

If you know the meanings of the individual words in (3), and know how these meanings are combined to make up the meaning of the whole sentence, you might say that (3) means the same as Margrietje loves Brunoke.

An observer — let's call her Olga — might well take this as evidence that you do grasp the meaning of (3). But this would depend on Olga herself understanding English. If she doesn't, then your translation from Dutch to English is not going to convince her of your ability to understand Dutch. We will return to this issue shortly.

How adequate is this grammar? You saw that the SQL translation for the whole sentence was built up from the translations of the components.

However, there does not seem to be a lot of justification for these component meaning representations: neither of them has a well-defined meaning in isolation from the other. There is another criticism we can level at the grammar: we have "hard-wired" an embarrassing amount of detail about the database into it. We need to know the name of the relevant table (e.g., city_table) and the names of the fields it contains. But our database could have contained exactly the same rows of data yet used a different table name and different field names, in which case the SQL queries would not be executable. Equally, we could have stored our data in a different format, such as XML, in which case retrieving the same results would require us to translate our English queries into an XML query language rather than SQL.

These considerations suggest that we should be translating English into something that is more abstract and generic than SQL. Consider another query: What cities are in China and have populations above 1,000,000? Your Turn: Extend the grammar sql0.fcfg so that it will translate this query into SQL. You will probably find it easiest to first extend the grammar to handle queries like What cities have populations above 1,000,000 before tackling conjunction.
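For reference, the SQL translation being aimed at presumably looks something like the following sketch (the threshold 1000 reflects the assumption that populations in city_table are stored in thousands):

    from nltk.sem import chat80

    # Assumed target translation for the conjoined query above.
    q = ("SELECT City FROM city_table "
         "WHERE Country = 'china' AND Population > 1000")
    for (city,) in chat80.sql_query('corpora/city_database/city.db', q):
        print(city, end=" ")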

This SQL query tells us to select results from rows where two conditions are true together: the value of the Country column is 'china' and the value of the Population column is greater than 1000. This interpretation of and involves a new idea: it talks about what is true in some particular situation, and tells us that Cond1 AND Cond2 is true in situation s just in case condition Cond1 is true in s and condition Cond2 is true in s. Although this doesn't account for the full range of meanings of and in English, it has the nice property that it is independent of any query language.

In fact, we have given it the standard interpretation from classical logic. In the following sections, we will explore an approach in which sentences of natural language are translated into logic instead of an executable query language such as SQL. One advantage of logical formalisms is that they are more abstract and therefore more generic. If we wanted to, once we had our translation into logic, we could then translate it into various other special-purpose languages.

In fact, most serious attempts to query databases via natural language have used this methodology. We started out trying to capture the meaning of (1a), Which country is Athens in?, by translating it into a query in another language, SQL, which the computer could interpret and execute.


But this still begged the question of whether the translation was correct. Stepping back from database querying, we noted that the meaning of and seems to depend on being able to specify when statements are true or not in a particular situation. Instead of translating a sentence S from one language to another, we try to say what S is about by relating it to a situation in the world.

Let's pursue this further. Imagine there is a situation s where there are two entities, Margrietje and her favourite doll, Brunoke. In addition, there is a relation holding between the two entities, which we will call the love relation.

If you understand the meaning of (3), then you know that it is true in situation s. In part, you know this because you know that Margrietje refers to Margrietje, Brunoke refers to Brunoke, and houdt van refers to the love relation. We have introduced two fundamental notions in semantics. The first is that declarative sentences are true or false in certain situations. The second is that definite noun phrases and proper nouns refer to things in the world. So (3) is true in a situation where Margrietje loves the doll Brunoke.

Once we have adopted the notion of truth in a situation, we have a powerful tool for reasoning. In particular, we can look at sets of sentences, and ask whether they could be true together in some situation. For example, the sentences in (5) can both be true, while those in (6) and (7) cannot.



(5) Sylvania is to the north of Freedonia. Freedonia is a republic.

(6) The capital of Freedonia has a population of 9,000. No city in Freedonia has a population of 9,000.

(7) Sylvania is to the north of Freedonia. Freedonia is to the north of Sylvania.

In other words, the sentences in (5) are consistent, while those in (6) and (7) are inconsistent. We have chosen sentences about fictional countries featured in the Marx Brothers' movie Duck Soup to emphasize that your ability to reason about these examples does not depend on what is true or false in the actual world.

If you know the meaning of the word no, and also know that the capital of a country is a city in that country, then you should be able to conclude that the two sentences in (6) are inconsistent, regardless of where Freedonia is or what the population of its capital is. That is, there's no possible situation in which both sentences could be true.


Similarly, if you know that the relation expressed by to the north of is asymmetric, then you should be able to conclude that the two sentences in (7) are inconsistent. Broadly speaking, logic-based approaches to natural language semantics focus on those aspects of natural language which guide our judgments of consistency and inconsistency. The syntax of a logical language is designed to make these features formally explicit.

As a result, determining properties like consistency can often be reduced to symbolic manipulation, that is, to a task that can be carried out by a computer. In order to pursue this approach, we first want to develop a technique for representing a possible situation. We do this in terms of something that logicians call a model.

A model for a set W of sentences is a formal representation of a situation in which all the sentences in W are true. The usual way of representing models involves set theory. The domain D of discourse (all the entities we currently care about) is a set of individuals, while relations are treated as sets built up from D.

Let's look at a concrete example. Our domain D will consist of three children, Stefan, Klaus and Evi, represented respectively as s, k and e. The expression boy denotes the set consisting of Stefan and Klaus, the expression girl denotes the set consisting of Evi, and the expression is running denotes the set consisting of Stefan and Evi. Later in this chapter we will use models to help evaluate the truth or falsity of English sentences, and in this way to illustrate some methods for representing meaning.
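A rough sketch of this model in NLTK's notation (the constant and predicate spellings, e.g. run for is running, are illustrative choices; the final evaluation just shows the model at work):

    import nltk

    # Entities s, k, e and the sets denoted by boy, girl and run, as described above.
    v = """
    stefan => s
    klaus => k
    evi => e
    boy => {s, k}
    girl => {e}
    run => {s, e}
    """
    val = nltk.Valuation.fromstring(v)   # maps constants to entities, predicates to sets
    dom = val.domain                     # the set {'s', 'k', 'e'}
    g = nltk.Assignment(dom)
    m = nltk.Model(dom, val)
    print(m.evaluate('girl(evi) & run(evi)', g))   # True: Evi is a girl and is running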

However, before going into more detail, let's put the discussion into a broader perspective, and link back to a topic that we briefly raised earlier: can a computer understand the meaning of a sentence? And how could we tell if it did? This is similar to asking "Can a computer think?", a question famously addressed by Alan Turing in terms of an imitation game.

Suppose you are having a chat session with a person and a computer, but you are not told at the outset which is which. If you cannot identify which of your partners is the computer after chatting with each of them, then the computer has successfully imitated a human. If a computer succeeds in passing itself off as human in this "imitation game" (or "Turing Test", as it is popularly known), then according to Turing, we should be prepared to say that the computer can think and can be said to be intelligent.

So Turing side-stepped the question of somehow examining the internal states of a computer by instead using its behavior as evidence of intelligence. By the same reasoning, we have assumed that in order to say that a computer understands English, it just needs to behave as though it did. What is important here is not so much the specifics of Turing's imitation game, but rather the proposal to judge a capacity for natural language understanding in terms of observable behavior.

A logical language is designed to make reasoning formally explicit.

As a result, it can capture aspects of natural language which determine whether a set of sentences is consistent. We'll start off with a simple example: a sentence made up of two sub-sentences joined by and. If we replace each sub-sentence with a propositional symbol and write & for the logical operator corresponding to and, we obtain a structure of the form P & Q; this is the logical form of such a conjoined sentence. Propositional logic allows us to represent just those parts of linguistic structure which correspond to certain sentential connectives. We have just looked at and. Other such connectives are not, or and if..., then.... In the formalization of propositional logic, the counterparts of such connectives are sometimes called boolean operators. The basic expressions of propositional logic are propositional symbols, often written as P, Q, R, etc.

There are varying conventions for representing boolean operators. From the propositional symbols and the boolean operators we can build an infinite set of well-formed formulas (or just formulas, for short) of propositional logic: every propositional symbol is a formula, and if φ and ψ are formulas, then so are -φ, (φ & ψ), (φ | ψ), (φ -> ψ) and (φ <-> ψ). The truth conditions for these operators are spelled out in a truth table; the rules are generally straightforward, though the truth conditions for implication depart in many cases from our usual intuitions about the conditional in English.
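A sketch of how such formulas look in NLTK's logic module (the ASCII operator spellings are -, &, |, -> and <->; the particular formulas are arbitrary examples):

    import nltk

    read_expr = nltk.sem.Expression.fromstring
    print(read_expr('-(P & Q)'))       # -(P & Q)        negation of a conjunction
    print(read_expr('P | (R -> Q)'))   # (P | (R -> Q))  disjunction containing an implication
    print(read_expr('P <-> --P'))      # (P <-> --P)     equivalence with a double negation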


From a computational perspective, logics give us an important tool for performing inference. Suppose you state that Freedonia is not to the north of Sylvania, and you give as your reason that Sylvania is to the north of Freedonia. In this case, you have produced an argument. The sentence Sylvania is to the north of Freedonia is the assumption of the argument while Freedonia is not to the north of Sylvania is the conclusion. The step of moving from one or more assumptions to a conclusion is called inference. Informally, it is common to write arguments in a format where the conclusion is preceded by therefore.

(9) Sylvania is to the north of Freedonia. Therefore, Freedonia is not to the north of Sylvania.

An argument is valid if there is no possible situation in which its premises are all true and its conclusion is not true. Now, the validity of (9) crucially depends on the meaning of the phrase to the north of, in particular, the fact that it is an asymmetric relation:

(10) If x is to the north of y, then y is not to the north of x.

Unfortunately, we can't express such rules in propositional logic: the smallest elements we have to play with are atomic propositions, and we cannot "look inside" these to talk about relations between individuals x and y.

The best we can do in this case is capture a particular case of the asymmetry. To say that Freedonia is not to the north of Sylvania, we write -FnS; that is, we treat not as equivalent to the phrase it is not the case that ..., and translate it with the one-place boolean operator -. So now we can write the implication in (10) as (11) SnF -> -FnS. How about giving a version of the complete argument? We will replace the first sentence of (9) by two formulas of propositional logic: SnF, and also the implication in (11), which expresses (rather poorly) our background knowledge of the meaning of to the north of. We'll write [A1, ..., An] / C to represent the argument that conclusion C follows from assumptions [A1, ..., An]. This leads to the following as a representation of argument (9): [SnF, SnF -> -FnS] / -FnS.

By contrast, if FnS were true, this would conflict with our understanding that two objects cannot both be to the north of each other in any possible situation. Arguments can be tested for "syntactic validity" by using a proof system. We will say a little bit more about this later on.

Logical proofs can be carried out with NLTK's inference module, for example via an interface to the third-party theorem prover Prover9. The inputs to the inference mechanism first have to be converted into logical expressions.

Here's another way of seeing why the conclusion follows: the implication SnF -> -FnS is semantically equivalent to the disjunction -SnF | -FnS. If SnF is true, then -SnF cannot also be true; a fundamental assumption of classical logic is that a sentence cannot be both true and false in a situation. Consequently, -FnS must be true.

Recall that we interpret sentences of a logical language relative to a model, which is a very simplified version of the world.
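Before turning to models, here is a sketch of the proof above carried out with NLTK's inference interface (this assumes the external Prover9 binary has been installed and configured for NLTK):

    import nltk

    read_expr = nltk.sem.Expression.fromstring
    SnF = read_expr('SnF')            # Sylvania is to the north of Freedonia
    NotFnS = read_expr('-FnS')        # Freedonia is not to the north of Sylvania
    R = read_expr('SnF -> -FnS')      # our (crude) background knowledge about "to the north of"

    prover = nltk.Prover9()           # nltk.ResolutionProver() is a pure-Python alternative
    print(prover.prove(NotFnS, [SnF, R]))   # True: the conclusion follows from the assumptions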

A model for propositional logic needs to assign the values True or False to every possible formula. We do this inductively: first, every propositional symbol is assigned a value, and then we compute the value of complex formulas by consulting the meanings of the boolean operators and applying them to the values of the formula's components. A Valuation is a mapping from basic expressions of the logic to their values. Here's an example.

We initialize a Valuation with a list of pairs, each of which consists of a semantic symbol and a semantic value. The resulting object is essentially just a dictionary that maps logical expressions (treated as strings) to appropriate values. As we will see later, our models need to be somewhat more complicated in order to handle the more complex logical forms discussed in the next section; for the time being, just ignore the dom and g parameters in the following declarations.

Now let's initialize a model m that uses val. Every model comes with an evaluate method, which will determine the semantic value of logical expressions, such as formulas of propositional logic; of course, these values depend on the initial truth values we assigned to propositional symbols such as P, Q and R. Your Turn: Experiment with evaluating different formulas of propositional logic. Does the model give the values that you expected?
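A minimal sketch of the declarations just described (the particular truth values chosen for P, Q and R are arbitrary):

    import nltk

    val = nltk.Valuation([('P', True), ('Q', True), ('R', False)])
    dom = set()                  # the domain is only needed for first-order models
    g = nltk.Assignment(dom)     # likewise the assignment g; ignore both for now
    m = nltk.Model(dom, val)

    print(m.evaluate('(P & Q)', g))    # True
    print(m.evaluate('-(P & Q)', g))   # False
    print(m.evaluate('(P | R)', g))    # True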

Up until now, we have been translating our English sentences into propositional logic.

Because we are confined to representing atomic sentences with letters like P and Q, we cannot dig into their internal structure. In effect, we are saying that there is nothing of logical interest to dividing atomic sentences into subjects, objects and predicates. However, this seems wrong: if we want to formalize arguments such as (9), we have to be able to "look inside" basic sentences.



As a result, we will move beyond propositional logic to something more expressive, namely first-order logic. This is what we turn to in the next section. In the remainder of this chapter, we will represent the meaning of natural language expressions by translating them into first-order logic. Not all of natural language semantics can be expressed in first-order logic. But it is a good choice for computational semantics because it is expressive enough to represent a good deal, and on the other hand, there are excellent systems available off the shelf for carrying out automated inference in first-order logic.


Our next step will be to describe how formulas of first-order logic are constructed, and then how such formulas can be evaluated in a model. First-order logic keeps all the boolean operators of propositional logic. But it adds some important new mechanisms.

To start with, propositions are analyzed into predicates and arguments, which takes us a step closer to the structure of natural languages. The standard construction rules for first-order logic recognize terms such as individual variables and individual constants, and predicates which take differing numbers of arguments.

For example, Angus walks might be formalized as walk(angus) and Angus sees Bertie as see(angus, bertie). We will call walk a unary predicate, and see a binary predicate. The symbols used as predicates do not have intrinsic meaning, although it is hard to remember this. Returning to one of our earlier examples, there is no logical difference between formalizing Margrietje houdt van Brunoke as love(margrietje, brunoke) or as houden_van(margrietje, brunoke).
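A quick sketch of these atomic predications in NLTK's logic notation:

    import nltk

    read_expr = nltk.sem.Expression.fromstring
    print(read_expr('walk(angus)'))          # walk(angus)        -- a unary predication
    print(read_expr('see(angus, bertie)'))   # see(angus,bertie)  -- a binary predication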

By itself, first-order logic has nothing substantive to say about lexical semantics — the meaning of individual words — although some theories of lexical semantics can be encoded in first-order logic. Whether an atomic predication like see(angus, bertie) is true or false in a situation is not a matter of logic, but depends on the particular valuation that we have chosen for the constants see, angus and bertie. For this reason, such expressions are called non-logical constants. By contrast, logical constants such as the boolean operators always receive the same interpretation in every model for first-order logic.

It is often helpful to inspect the syntactic structure of expressions of first-order logic, and the usual way of doing this is to assign types to expressions. Following the tradition of Montague grammar, we will use two basic types: e is the type of entities, while t is the type of formulas, i.e., expressions which have truth values. Given these two basic types, we can form complex types for function expressions: given any types σ and τ, <σ, τ> is the type of a function from things of type σ to things of type τ.

A logical expression can be processed with type checking. Although the type-checker will try to infer as many types as possible, in this case it has not managed to fully specify the type of walk, since its result type is unknown. To help the type-checker, we need to specify a signature, implemented as a dictionary that explicitly associates types with non-logical constants:
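A sketch of what this looks like in practice (the signature entries for walk and see are illustrative; <e,t> is the type of a unary predicate, <e,<e,t>> that of a binary one):

    import nltk

    read_expr = nltk.sem.Expression.fromstring

    expr = read_expr('walk(angus)', type_check=True)
    print(expr.argument.type)   # e      -- angus is an entity
    print(expr.function.type)   # <e,?>  -- the result type of walk cannot be inferred

    # A signature resolves the uncertainty:
    sig = {'walk': '<e, t>', 'see': '<e, <e, t>>'}
    expr = read_expr('walk(angus)', signature=sig)
    print(expr.function.type)   # <e,t>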

Although <e, <e, t>> is the type of something which combines first with an argument of type e to make a unary predicate, we represent binary predicates as combining directly with their two arguments. For example, the predicate see in the translation of Angus sees Cyril will combine with its arguments to give the result see(angus, cyril). In first-order logic, arguments of predicates can also be individual variables such as x, y and z.

In NLTK, we adopt the convention that variables of type e are all lowercase. Individual variables are similar to personal pronouns like he, she and it, in that we need to know about the context of use in order to figure out their denotation. One way of interpreting the pronoun in (14), He disappeared, is by pointing to a relevant individual in the local context. Another way is to supply a textual antecedent for the pronoun he, for example by uttering (15a) prior to (14). Here, we say that he is coreferential with the noun phrase Cyril.

As a result, (14) is semantically equivalent to (15b):

(15) a. Cyril is Angus's dog. b. Cyril disappeared.

Consider by contrast the occurrence of he in (16a). In this case, it is bound by the indefinite NP a dog, and this is a different relationship than coreference. If we replace the pronoun he by a dog, the result (16b) is not semantically equivalent to (16a):

(16) a. Angus had a dog but he disappeared. b. Angus had a dog but a dog disappeared.

Corresponding to (17a), we can construct an open formula (17b) with two occurrences of the variable x (we ignore tense to simplify exposition):

(17) a. He is a dog and he disappeared. b. dog(x) & disappear(x)

By placing an existential quantifier exists x ("for some x") in front of (17b), we bind the variable x, and the resulting formula means At least one entity is a dog and disappeared or, more idiomatically, A dog disappeared. Placing a universal quantifier all x ("for all x") in front of an implication instead gives us (20a), all x.(dog(x) -> disappear(x)), which means (20b) Everything has the property that if it is a dog, it disappears, i.e., (20c) Every dog disappeared.

Although (20a) is the standard first-order logic translation of (20c), the truth conditions aren't necessarily what you expect. The formula says that if some x is a dog, then x disappears — but it doesn't say that there are any dogs.
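A quick sketch of these quantified formulas in NLTK's ASCII notation (exists and all spell the existential and universal quantifiers):

    import nltk

    read_expr = nltk.sem.Expression.fromstring

    open_formula = read_expr('dog(x) & disappear(x)')
    print(open_formula.free())    # {Variable('x')} -- x is free in the open formula

    closed = read_expr('exists x.(dog(x) & disappear(x))')
    print(closed.free())          # set() -- the quantifier binds x

    universal = read_expr('all x.(dog(x) -> disappear(x))')
    print(universal)              # all x.(dog(x) -> disappear(x))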

So in a situation where there are no dogs, (20a) will still come out true. Now you might argue that every dog disappeared does presuppose the existence of dogs, and that the logic formalization is simply wrong. But it is possible to find other examples which lack such a presupposition.

For instance, we might explain that the value of the Python expression astring. We have seen a number of examples where variables are bound by quantifiers.