{"title":"Tournaments, Johnson Graphs, and NC-Teaching","authors":"H. Simon","doi":"10.48550/arXiv.2205.02792","DOIUrl":null,"url":null,"abstract":"Quite recently a teaching model, called\"No-Clash Teaching\"or simply\"NC-Teaching\", had been suggested that is provably optimal in the following strong sense. First, it satisfies Goldman and Matthias' collusion-freeness condition. Second, the NC-teaching dimension (= NCTD) is smaller than or equal to the teaching dimension with respect to any other collusion-free teaching model. It has also been shown that any concept class which has NC-teaching dimension $d$ and is defined over a domain of size $n$ can have at most $2^d \\binom{n}{d}$ concepts. The main results in this paper are as follows. First, we characterize the maximum concept classes of NC-teaching dimension $1$ as classes which are induced by tournaments (= complete oriented graphs) in a very natural way. Second, we show that there exists a family $(\\cC_n)_{n\\ge1}$ of concept classes such that the well known recursive teaching dimension (= RTD) of $\\cC_n$ grows logarithmically in $n = |\\cC_n|$ while, for every $n\\ge1$, the NC-teaching dimension of $\\cC_n$ equals $1$. Since the recursive teaching dimension of a finite concept class $\\cC$ is generally bounded $\\log|\\cC|$, the family $(\\cC_n)_{n\\ge1}$ separates RTD from NCTD in the most striking way. The proof of existence of the family $(\\cC_n)_{n\\ge1}$ makes use of the probabilistic method and random tournaments. Third, we improve the afore-mentioned upper bound $2^d\\binom{n}{d}$ by a factor of order $\\sqrt{d}$. The verification of the superior bound makes use of Johnson graphs and maximum subgraphs not containing large narrow cliques.","PeriodicalId":267197,"journal":{"name":"International Conference on Algorithmic Learning Theory","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Algorithmic Learning Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2205.02792","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Quite recently, a teaching model called "No-Clash Teaching", or simply "NC-Teaching", was suggested that is provably optimal in the following strong sense. First, it satisfies Goldman and Mathias' collusion-freeness condition. Second, the NC-teaching dimension (= NCTD) is smaller than or equal to the teaching dimension with respect to any other collusion-free teaching model. It has also been shown that any concept class of NC-teaching dimension $d$ that is defined over a domain of size $n$ contains at most $2^d \binom{n}{d}$ concepts. The main results of this paper are as follows. First, we characterize the maximum concept classes of NC-teaching dimension $1$ as the classes which are induced by tournaments (= oriented complete graphs) in a very natural way. Second, we show that there exists a family $(\mathcal{C}_n)_{n\ge1}$ of concept classes such that the well-known recursive teaching dimension (= RTD) of $\mathcal{C}_n$ grows logarithmically in $n = |\mathcal{C}_n|$ while, for every $n\ge1$, the NC-teaching dimension of $\mathcal{C}_n$ equals $1$. Since the recursive teaching dimension of a finite concept class $\mathcal{C}$ is generally bounded by $\log|\mathcal{C}|$, the family $(\mathcal{C}_n)_{n\ge1}$ separates RTD from NCTD in the most striking way. The proof of existence of the family $(\mathcal{C}_n)_{n\ge1}$ makes use of the probabilistic method and random tournaments. Third, we improve the aforementioned upper bound $2^d\binom{n}{d}$ by a factor of order $\sqrt{d}$. The verification of the improved bound makes use of Johnson graphs and maximum subgraphs not containing large narrow cliques.
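
As a small illustration of the notions discussed in the abstract, the following sketch (not taken from the paper) checks the standard no-clash condition for a teacher that assigns a single labelled example to each concept, i.e. a witness that a class has NC-teaching dimension 1. The toy concept class, the teacher mapping, and the helper names (`consistent`, `is_non_clashing`) are illustrative choices; only the no-clash condition itself (no two distinct concepts may both be consistent with the union of their teaching sets) follows the usual definition.

```python
# Minimal sketch: verify that a single-example teacher mapping is non-clashing,
# which certifies NC-teaching dimension 1 for the given concept class.
# Concepts are bit-tuples over the domain {0, ..., n-1}; the teacher maps each
# concept to a tuple of (instance, label) pairs consistent with that concept.

from itertools import combinations


def consistent(concept, sample):
    """True iff the concept agrees with every labelled example in the sample."""
    return all(concept[x] == label for x, label in sample)


def is_non_clashing(teacher):
    """teacher: dict mapping concept -> tuple of (instance, label) pairs.

    Two distinct concepts clash if each one is consistent with the other's
    teaching set; the teacher is non-clashing if no pair of concepts clashes.
    """
    for c1, c2 in combinations(teacher, 2):
        if consistent(c1, teacher[c2]) and consistent(c2, teacher[c1]):
            return False
    return True


# Toy example: domain of size n = 2, all four concepts, one example per concept.
teacher = {
    (0, 0): ((0, 0),),  # teach 00 with the example (x0, 0)
    (0, 1): ((1, 1),),  # teach 01 with the example (x1, 1)
    (1, 0): ((1, 0),),  # teach 10 with the example (x1, 0)
    (1, 1): ((0, 1),),  # teach 11 with the example (x0, 1)
}

assert all(consistent(c, s) for c, s in teacher.items())  # teacher is truthful
print(is_non_clashing(teacher))  # True -> the class has NC-teaching dimension 1
```

For $d = 1$ and $n = 2$, this toy class meets the upper bound $2^d\binom{n}{d} = 2n = 4$ stated in the abstract; for larger domains, the paper characterizes the maximum classes of NC-teaching dimension $1$ as those induced by tournaments.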