Learning of monotone functions is a well-known problem. Results obtained by V. K. Korobkov and G. Hansel imply that the complexity $\varphi_M(n)$ of learning of monotone Boolean functions equals $\binom{n}{\lfloor n/2 \rfloor} + \binom{n}{\lfloor n/2 \rfloor + 1}$, where $\varphi_M(n)$ denotes the least number of queries on the value of an unknown monotone function on a given input that suffices to identify an arbitrary $n$-ary monotone function. In our paper we consider learning of monotone functions in the case when the teacher is allowed to return an incorrect response to at most one query on the value of the unknown function, so that it is still possible to identify the function correctly. We show that the learning complexity in the case when a single error is possible equals the complexity in the situation when all responses are correct.
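As a point of reference, the short sketch below (plain Python; the helper name korobkov_hansel_bound is ours, not from the paper) evaluates the Korobkov–Hansel query bound $\binom{n}{\lfloor n/2 \rfloor} + \binom{n}{\lfloor n/2 \rfloor + 1}$ for small $n$ and contrasts it with the trivial strategy of querying the unknown function on all $2^n$ inputs.

```python
from math import comb

def korobkov_hansel_bound(n: int) -> int:
    """Least number of value queries sufficient to identify any
    n-ary monotone Boolean function: C(n, floor(n/2)) + C(n, floor(n/2) + 1)."""
    k = n // 2
    return comb(n, k) + comb(n, k + 1)

if __name__ == "__main__":
    for n in range(1, 11):
        # Compare the optimal query count with querying every input.
        print(n, korobkov_hansel_bound(n), 2 ** n)
```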
ISBN (print): 3540407200
Sublearning, a model for learning of subconcepts of a concept, is presented. Sublearning a class of total recursive functions informally means to learn all functions from that class together with all of their subfunctions. While in language learning it is known to be impossible to learn any infinite language together with all of its sublanguages, the situation changes for sublearning of functions. Several types of sublearning are defined and compared to each other as well as to other learning types. For example, in some cases, sublearning coincides with robust learning. Furthermore, whereas in usual function learning there are classes that cannot be learned consistently, all sublearnable classes of some natural types can be learned consistently. Moreover, the power of sublearning is characterized in several terms, thereby establishing a close connection to measurable classes and variants of this notion. As a consequence, there are rich classes that can be sublearned without any self-referential coding.
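To make "subfunction" concrete: a subfunction of a total recursive function is its restriction to a subset of the domain. The toy sketch below (our own illustration, not the paper's formalism) represents a finite piece of a total function as a Python dict and enumerates the subfunctions induced by subsets of that piece; a sublearner must succeed on the function and on all such restrictions.

```python
from itertools import combinations

def restrict(f, domain):
    """Subfunction of f: f restricted to the given subset of its domain."""
    return {x: f[x] for x in domain}

# A finite piece of a total function, e.g. f(x) = x * x on {0, ..., 4}.
f = {x: x * x for x in range(5)}

# Every subset of the (finite) domain piece induces a subfunction.
subfunctions = [restrict(f, s)
                for k in range(len(f) + 1)
                for s in combinations(f, k)]
print(len(subfunctions))  # 2**5 = 32 restrictions of this finite piece
```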