{"title":"Grocery Shopping Has Become Complicated. Managing Dual-tasks in a Supermarket for Patients with Alzheimer’s Disease","authors":"F. Morganti, Elena Minelli","doi":"10.37394/232020.2021.1.10","DOIUrl":"https://doi.org/10.37394/232020.2021.1.10","url":null,"abstract":"The executive functions neuropsychological assessment with paper and pencil tests is particularly sensitive to the lack of ecological validity. If the purpose of the clinician is not to get a merely theoretical measure, but to assess the deficit impact on patient’s life, the classical tests for executive functions are not very informative. A more ecological measure could be provided by the direct observation of the patient in his daily life, but this practice is largely cost-demanding and difficult to implement. Moreover by assuming an enactive cognition approach, the role of interaction between the embodied behaviors and the affordances provided in the coupling with the environment in which the agent’s activity is placed remains unrevealed from classical neuropsychological tests. Recently, the introduction of virtual reality simulations in clinical neuropsychology seems to provide peculiar opportunities for the evaluation of the ecological impact of disease in neurological patients, while introducing new inputs for the research on embodied cognition and enactive knowledge acquisition. This contribution aims in analyzing how a virtual reality dual-task test, set in the scene of a supermarket, can be used to assess executive functions in adult/elderly population and in patients with Alzheimer disease. The results show that the test has convergent validity, compared to the Mini Mental State Evaluation and Tower of London tests and discriminant validity in identifying the degenerative disease, compared to healthy subjects matched for age, sex and education.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"103 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76708974","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modeling of the Plane Film Flow in Alternating Electromagnetic Field","authors":"Ivan Kazachkov","doi":"10.37394/232020.2021.1.9","DOIUrl":"https://doi.org/10.37394/232020.2021.1.9","url":null,"abstract":"The modeling of a plane film flow affected by alternating electromagnetic field (running EM wave) is considered in the paper. Basic parameters of film flow and specific peculiarities of parametrically excited oscillations in a film flow are studied and discussed for the theory, as well as for the diverse engineering and technological applications. The main attention is focused on the film flow spreading on a solid surface or in another liquid medium with comparably high velocities when inertia forces are playing together with capillary and electromagnetic ones (gravity forces are of lower impact due to high flow velocity). Scientific novelty of present study consists in the revealed new phenomena of the film flow oscillations and available decay or stabilization under appropriate practical statement.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"51 6 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-08-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79864210","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Geometric Probabilities in Euclidean Space E3","authors":"G. Caristi, A. Puglisi, E. Saitta","doi":"10.37394/232020.2021.1.8","DOIUrl":"https://doi.org/10.37394/232020.2021.1.8","url":null,"abstract":"In the last year G. Caristi and M. Stoka [2] have considered Laplace type problem for different lattice with or without obstacles and compute the associated probabilities by considering bodies test not-uniformly distributed. We consider a lattice with fundamental cell a parallelepiped in the Ecuclidean Space E3. We compute the probability that a random segment of constant length, with exponential distribution, intersects a side of the lattice","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"49 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88961160","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Novel Method of 2D Computation and Reconstruction","authors":"D. Jakóbczak","doi":"10.37394/232020.2021.1.6","DOIUrl":"https://doi.org/10.37394/232020.2021.1.6","url":null,"abstract":"Proposed method, called Probabilistic Nodes Combination (PNC), is the method of 2D curve modeling and handwriting identification by using the set of key points. Nodes are treated as characteristic points of signature or handwriting for modeling and writer recognition. Identification of handwritten letters or symbols need modeling and the model of each individual symbol or character is built by a choice of probability distribution function and nodes combination. PNC modeling via nodes combination and parameter γ as probability distribution function enables curve parameterization and interpolation for each specific letter or symbol. Two-dimensional curve is modeled and interpolated via nodes combination and different functions as continuous probability distribution functions: polynomial, sine, cosine, tangent, cotangent, logarithm, exponent, arc sin, arc cos, arc tan, arc cot or power function.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"60 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77754422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Simulation of Extreme Insured Losses in Natural Catastrophes","authors":"V. Pacáková, P. Jindrová","doi":"10.37394/232020.2021.1.5","DOIUrl":"https://doi.org/10.37394/232020.2021.1.5","url":null,"abstract":"This article aims to present the application of probability modelling and simulations based on quantile function of extreme insured losses in the world natural catastrophes based on data in time period 1970-2014, published in Swiss Re Sigma No2/2015. Quantile function provides an appropriate and flexible approach to the probability modelling needed to obtain well-fitted tails. We are specifically interested in modelling and simulations the tails of loss distributions. In a number of applications of quantile functions in insurance and reinsurance risk management interest focuses particularly on the extreme observations in the upper tail of probability distribution. Fortunately it is possible to simulate the observations in one tail of distribution without simulating the central values. This advantage will be used for estimate a few extreme high insured losses in the world’s natural catastrophes in future.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"14 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-04-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74012155","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A New Model for Determining Production Capacity","authors":"T. Buda, Emilia Calefariu, F. Sârbu, Garila Calefariu","doi":"10.37394/232020.2021.1.4","DOIUrl":"https://doi.org/10.37394/232020.2021.1.4","url":null,"abstract":"Traian Alexandru Buda, EThis paper introduces a new model for computing the production capacity in a manufacturing system or in a supply chain. Solving the production capacity problem means to be able to answer the following question: how many parts, from each product, can be produced by a given manufacturing system in a given time span considering the product mix and a multi-stage Bill of Materials? The proposed model is able to determine the production capacity and the loading level per resource for a manufacturing system using as inputs the Bill of Materials, Routing file, time span and product mix. The novelty brought by this method consists in the adoption of the matrix calculus in order to manipulate the inputs to obtain the requested outputs. The opportunity of such a model is that it offers the complete view on the capacity problem with a full range of answers: the available and required capacity at resource and finished product level and the loading level for each resource. Also the facile implementation and integration in ERP systems is a vital point. The use of such a model is in the investment process, middle and long-term capacity planning and client order confirmation process. The model aims to solveany type of manufacturing system.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"9 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87501491","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Student Perceptions on the Effectiveness of Collaborative Problem-based Learning Using Online Pair Programming Tools","authors":"Maira Kotsovoulou, Vassilia Stefanou","doi":"10.37394/232020.2021.1.3","DOIUrl":"https://doi.org/10.37394/232020.2021.1.3","url":null,"abstract":"his paper describes a qualitative study of how undergraduate students majoring in Information Technology perceive the effectiveness and evaluate the learning experience of pair-programming. The phenomenographic research approach was used to analyze student interviews and revealed 4 categories of descriptions: Effective Problem Solving, Participation, Enjoyment and Coding. Pair-programming as a teaching methodology was commonly perceived as a positive experience. The resulting outcome space maps a logical hierarchy of students’ conceptions of reality (categories of description). Findings of this research identify the factors that affect student engagement in a problem-solving process and can be used as a guiding principle on how to improve students’ learning experience of computer programming.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"17 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-02-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89292316","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimal Indebtedness of the Concrete Firm in the Passive Way of Optimizing","authors":"M. Hrdý","doi":"10.37394/232020.2021.1.2","DOIUrl":"https://doi.org/10.37394/232020.2021.1.2","url":null,"abstract":"The aim of this article is to analyze the problem of optimizing of the capital structure of the concrete firm in the passive way of optimizing and to find some possible branch standards in the chosen branches. The most important problem in the theory of optimizing is to identify the theory which best fit and in practice it is necessary to cope with some application problems which results from the use of the concrete theory. The firm in fact has to decide firstly if to start the process of active optimizing or if it is satisfied with the passive way of optimizing by respecting the branch standards or by respecting the Pecking Order Theory. In case of branch standards the firm is trying to come near the average indebtedness in some concrete branch and in case of the Pecking Order Theory the firm follows the recommendations of this theory and use firstly the internal financial sources, then debt and finally the equity. The most important application problem in case of using the branch standards is the identification of the average indebtedness in some concrete branch and to judge if this average indebtedness is possible to use as this standard or if it is necessary to adjust this amount according to the other factors. In case of using the Pecking Order Theory it is necessary for the firm to have all financial sources available otherwise the process of optimizing means only findings actually any kind of financial source. The concrete branches were chosen and some concrete data were analyzed. It was found out that the branch standards are relatively complicated way of optimizing of the capital structure of the firm, but relatively feasible. The branch standards on the level of 50 % of indebtedness were recommended for the branch of Beverages and Construction Supplies, on the level of 45 % for the branch of Railroads, on the level of 40 % for the branch of Electrical Equipment and on the level of 35 % for the branch of Machinery.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"2 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77818777","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Ontology-based Approach to Collaborative Development of Domain Information Space","authors":"A. Anikin, Dmitry Litovkin, M. Kultsova, Volgograd","doi":"10.37394/232020.2021.1.1","DOIUrl":"https://doi.org/10.37394/232020.2021.1.1","url":null,"abstract":"This paper is devoted to a problem of creating the domain information space for the purpose of further information retrieval and reuse in different subject domains. We propose an ontology-based approach to collaborative development of the domain information space using the personal human cognitive spaces. In framework of the proposed approach the information space ontology is constructed and the algorithm for generation of domain information space is developed on the base of personal cognitive spaces using reasoning on ontology. The application of the proposed approach is illustrated by the example of the information space construction for subject domain of programming languages.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"29 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91274993","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Killing Kaplanism: Flawed methodologies, the standard of proof and modernity","authors":"William Cullerne Bown","doi":"10.1177/1365712718798387","DOIUrl":"https://doi.org/10.1177/1365712718798387","url":null,"abstract":"Attempts to establish a quantitative framework for policy-making in the criminal justice system in recent decades have coalesced around the problem of the standard of proof and Kaplan’s influential 1968 paper. The central thread of work continues to use an equation he put forward while abandoning some of his foundational assumptions, an approach I call ‘Kaplanism’. Despite a growing awareness of deficiencies, elements of this school of thought, such as the parsing of concerns into the two categories of ‘error reduction’ and ‘error distribution’, have entered the general jurisprudential discourse. Here I launch a methodological attack and claim to kill this approach. This allows me to refute Laudan and other ‘consequentialist’ approaches to the standard identified by Walen, Walen’s own approach and an important part of Stein’s underpinnings. The same tools allow me to also refute Laudan’s earlier m/n meta-epistemology, Lippke’s ‘adage’, Stewart’s formalisation of Dworkin, Dahlman’s Bayesian work and (at least in criminal law) Kaplow’s law and economics approach. I also refute Hamer’s ‘conventional rationale’ for the current standard, Lillquist’s approach to the same and what Epps reports as ‘the Blackstone principle’. The law is left with no epistemic basis for policies, which, I argue, leaves it struggling for public trust in the modern era.","PeriodicalId":93382,"journal":{"name":"The international journal of evidence & proof","volume":"6 1","pages":"229 - 254"},"PeriodicalIF":0.0,"publicationDate":"2018-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75333637","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}