# Vorträge im Logik-Seminar

## Summer term 2019

## Winter term 2018/19

## Winter term 2017/18

**Friday, February 9, 2018, at 13:30 in S2|15-201.** Dr. Florian Steinberg, INRIA: *Type-two poly-time and length revisions*

„We present a new and fairly simple characterization of the type-two poly-time computable functionals. While being machine-based, it avoids the use of higher-order objects as running times and instead uses an approach, first put forward by Cook, that sticks to first-order functions to formulate time constraints. Cook's class is known to contain more than just feasible operations, and we thus impose further restrictions. The purpose of these restrictions is very simple: to avoid unbounded iteration of functional input. We prove that this leads to a subclass of all feasible functionals and that closing under composition leads back to the class of all feasible functionals. We also prove that the same is true for a more restrictive condition, namely for the notion of strong poly-time.“ This is joint work with Bruce Kapron.
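The danger of iterating a functional input, which the abstract's restrictions are designed to rule out, can be illustrated with a toy example (our own sketch, not taken from the talk; all names are hypothetical): even when the oracle itself is cheap, applying it an input-dependent number of times produces output of exponential length.

```python
def iterate_oracle(f, x, n):
    """Apply the functional input f to x repeatedly, n times in total."""
    for _ in range(n):
        x = f(x)
    return x

# A perfectly poly-time oracle: doubling the length of its input string.
double = lambda s: s + s

# Iterating it n times yields a string of length 2**n * len(x) --
# exponential in n, even though each individual call to `double` is cheap.
result = iterate_oracle(double, "a", 10)
assert len(result) == 2 ** 10
```

This is exactly why a class of feasible type-two functionals cannot be closed under arbitrary iteration of its oracle calls.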

**Friday, February 2, 2018, at 13:30 in S2|15-201.** Dr. Adriana Nicolae, University of Seville: *„Lion Man“ and the Fixed Point Property*

In this talk we are concerned with the relation between the fixed point property for continuous mappings and a discrete lion-and-man game played in a strongly convex domain. Our main result states that in locally compact geodesic spaces or in R-trees, the fixed point property of the domain is equivalent to the success of the lion. At the end we review a few applications of proof mining in the context of geodesic metric spaces. This talk is based on joint work with Genaro López-Acedo (University of Seville) and Bozena Piatek (Silesian University of Technology).

**Friday, January 26, 2018, at 13:30 in S2|15-201.** Dr. Andrei Sipos: *Contributions to proof mining*

I will present the results of my PhD thesis, belonging to the area of proof mining, an applied subfield of mathematical logic. Proof mining (introduced and developed by U. Kohlenbach and his collaborators) aims to obtain quantitative information -- primarily witnesses and bounds for existentially quantified variables -- out of ordinary mathematical proofs of an apparently non-constructive nature. The results under discussion fall naturally into three categories. Firstly, there are some representative applications of such techniques in nonlinear analysis and convex optimization -- the extraction of rates of metastability, asymptotic regularity and even full convergence for some iterations studied in those fields, e.g. the parallel algorithm, the Ishikawa iteration or the proximal point algorithm. Secondly, as this kind of research can easily spill over into related areas, I present some advances in pure analysis, such as a characterization of 2-uniformly convex Banach spaces and a general definition that encompasses all the known instances of resolvent operators used in convex optimization. Finally, I present some work in the foundations of proof mining, namely an implicit characterization of the class of L^p spaces that is amenable to a logical treatment, i.e. one that allows for a suitable metatheorem underlying possible applications. This is followed by a derivation of the modulus of uniform convexity for this class of spaces from the axiomatization. Some of these results are joint work with Laurentiu Leustean and Adriana Nicolae.

**Friday, November 3, 2017, at 13:30 in S2|15-201.** Anton Freund, University of Leeds: *From Cut Elimination to Pi^1_1-Comprehension*

I will begin with an informal explanation of cut elimination and its use in the ordinal analysis of Peano arithmetic. The latter shows that Pi^0_2-soundness is equivalent to the primitive recursive well-foundedness of an ordinal notation system. On a conceptual level, we observe that the focus is on low logical complexity. Next, I explain how an ordinal notation system can be relativized to a given well-order. This brings us to the level of Pi^1_2-statements. Many well-known axioms of that complexity have been characterized in terms of relativized ordinal notation systems: from arithmetical comprehension (Girard, Hirst) up to the existence of omega-models of bar induction (Rathjen and Vizcaíno). Finally, I present a new result at the next type level: following a suggestion by Montalbán and Rathjen, I relativize an ordinal notation system to a relativized ordinal notation system. In this way one can obtain a characterization of Pi^1_1-comprehension, which is a Pi^1_3-statement (preprint available as arXiv:1704.01662).
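To make the notion of an ordinal notation system concrete, here is a minimal sketch (our own illustration, not from the talk): ordinals below epsilon_0 in Cantor normal form omega^a1 + ... + omega^ak with a1 >= ... >= ak, coded as tuples of exponents, together with a primitive recursive comparison. The well-foundedness of exactly such a syntactic order is what the ordinal analysis of Peano arithmetic refers to. The encoding choices below are ours.

```python
# Ordinals below epsilon_0 in Cantor normal form, coded as tuples of
# exponent codes (in weakly decreasing order); each exponent is again
# such a tuple.  () codes 0, and ((),) codes 1 = omega**0.
def less(a, b):
    """Compare two normal-form codes as ordinals: a < b."""
    for x, y in zip(a, b):
        if less(x, y):
            return True
        if less(y, x):
            return False
    # all compared terms equal: the shorter sum is the smaller ordinal
    return len(a) < len(b)

zero = ()
one = ((),)         # omega**0
omega = (one,)      # omega**1
assert less(one, omega) and less(omega, (one, zero))  # 1 < omega < omega + 1
```

The comparison is a simple syntactic recursion, so it stays at very low logical complexity; only the statement that `less` is well-founded carries proof-theoretic strength.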

**Friday, October 6, 2017, at 13:30 in S2|15-201.** Prof. Dr. Martin Ziegler, KAIST School of Computing: *Formal Verification in Imperative Multivalued Programming over Continuous Data Types*

(Joint work with Gyesik Lee, Norbert Müller, Eike Neumann, Sewon Park, and Norbert Preining.) Inspired and guided by the iRRAM C++ library (Müller 2001), we formally specify a programming language for the paradigm of exact real computation: reliably operating on encapsulated continuous data types such as (not necessarily algebraic) real numbers -- imperatively and exactly (no rounding errors), with primitives computable in the sense of Recursive Analysis and a necessarily modified multivalued (non-extensional) semantics of tests. We then propose a complete logical system over two types, integers and real numbers, for rigorous correctness proofs of such algorithms. We extend the rules of Hoare Logic to support the formal derivation of such proofs and have them verified in the Coq Proof Assistant. Our approach is exemplified with three simple numerical problems: integer rounding, solving systems of linear equations, and continuous root finding.
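The "multivalued semantics of tests" mentioned in the abstract can be sketched in a few lines of Python (our own toy model, not the iRRAM API; all names are hypothetical). An exact real is modelled as a function from a precision n to a rational 2^-n-approximation; a comparison between two such reals can then only be guaranteed to terminate when they differ, and for nearly equal arguments it may legitimately answer either way, which is what makes it non-extensional.

```python
from fractions import Fraction

def choose(x, y):
    """Multivalued comparison on exact reals, each given as a function
    mapping a precision n to a rational within 2**-n of the real.
    Returns 0 if certainly x < y and 1 if certainly y < x; termination
    is only guaranteed when x != y.  For very close arguments the answer
    depends on the approximations seen, not just on the reals themselves.
    """
    n = 0
    while True:
        a, b, eps = x(n), y(n), Fraction(1, 2 ** n)
        if a + eps < b - eps:
            return 0
        if b + eps < a - eps:
            return 1
        n += 1

# Constant approximation functions are (trivial) names of 1/3 and 1/2.
third = lambda n: Fraction(1, 3)
half = lambda n: Fraction(1, 2)
assert choose(third, half) == 0
```

A classical (total, extensional) test `x < y` on reals is uncomputable; replacing it by such a multivalued primitive is the standard repair, and it is precisely this semantics that the proposed Hoare-style logic has to accommodate.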

## Summer term 2017

**Wednesday, September 13, 2017, at 14:00 in S2|15-201.** Eike Neumann: *Continuous enclosures of discontinuous problems*

The main theorem of computable analysis asserts that any computable function is continuous. Many interesting problems encountered in practice turn out to be discontinuous, however. It is hence a natural question how much partial information can be obtained about the solutions of a discontinuous problem in a continuous or computable way. To formalise this idea, we propose the notion of continuous enclosures: continuous multi-valued functions taking values in certain semilattices which encode partial information about the solutions of the original problem. We give some existence results and provide some explicit examples.
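As a toy illustration of the idea (our own example, not one from the abstract): the sign function on the reals is discontinuous at 0 and hence uncomputable, but given an interval known to contain the input one can continuously return a *set* of possible signs, i.e. an element of the semilattice of subsets of {-1, 0, 1}.

```python
def sign_enclosure(lo, hi):
    """Partial information about sign(x) for an x known to lie in [lo, hi]:
    the set of signs x may have.  Shrinking the input interval can only
    shrink the answer set, so the map is monotone -- it refines, never
    retracts, the partial information it provides."""
    signs = set()
    if lo < 0:
        signs.add(-1)
    if lo <= 0 <= hi:
        signs.add(0)
    if hi > 0:
        signs.add(1)
    return signs

assert sign_enclosure(0.5, 2.0) == {1}       # sign fully determined
assert sign_enclosure(-1.0, 2.0) == {-1, 0, 1}  # no information gained yet
```

The singleton answers are exact solutions; the larger sets are honest partial answers where a continuous function cannot commit.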

**Friday, July 21, 2017, at 13:30 in S2|15-201.** Julian Bitterlich, TU Darmstadt: *Special classes of acyclic groups and their applications*

I present some connections between results in monoid theory and in (finite) model theory. More specifically, we look at theorems about the profinite topology on free groups (M. Hall; Ribes and Zalesskii) and compare these to theorems concerning extension problems for partial automorphisms (Hrushovski; Herwig and Lascar). The two threads can be related through a suitable notion of acyclicity in groups, which provides yet another viewpoint on the topic.

**Friday, July 7, 2017, at 13:30 in S2|15-201.** Felix Canavoi, TU Darmstadt: *A Modal Characterisation Theorem for Common Knowledge Logic*

We investigate epistemic modal logic with common knowledge modalities for groups of agents and obtain two van Benthem style model-theoretic characterisation theorems, a classical version and one in the sense of finite model theory. In order to do that, we show that basic modal logic is expressively equivalent to the bisimulation-invariant fragment of first-order logic over the non-elementary class of so-called (finite) common knowledge Kripke frames. The transitive nature of common knowledge modalities poses a new challenge that cannot be tackled by the usual locality-based techniques. For (finite) common knowledge Kripke frames we construct (finite) bisimilar companions that are based on Cayley graphs of suitable groups with specific acyclicity properties. These Cayley graphs possess a very intricate yet highly regular structure that allows for a locality analysis on multiple scales, induced by distance measures w.r.t. various coalitions of agents.

**Friday, June 30, 2017, at 13:30 in S2|15-201.** Prof. Andrey Morozov, Sobolev Institute of Mathematics, Novosibirsk: *Infinite time Blum-Shub-Smale machines for computability in analysis*

(Joint work with Peter Koepke.) We continue the study, started by P. Koepke and B. Seyfferth, of the computational strength of infinite time Blum-Shub-Smale machines (ITBMs), which perform computations over the reals in ordinal time. We give a characterization of ITBM-computable functions via iterated Turing jumps and discuss how adequately this approach captures the intuitive notions of computability used in analysis.

**Friday, June 16, 2017, at 13:30 in S2|15-201.** Prof. Rodney G. Downey, Victoria University of Wellington: *Notes on Computability in Mathematics*

We look at some recent advances in the understanding of the computational content of mathematics. I will mainly concentrate on problems in and around abelian groups.

**Tuesday, April 11, 2017, at 11:00 in S2|15-201.** Michal Konecny, Aston University: *Representations for feasibly approximable functions*

(Joint work with Eike Neumann.) "Exact" continuous real functions can feature as first-class objects in computation, in a similar way to so-called exact real numbers. While all natural representations of continuous functions used in computable analysis are computably equivalent, different admissible representations may fail to be equivalent under the polynomial-time reduction defined by Kawamura and Cook (2012). Building on work by Labhalla, Lombardi, and Moutai (2001) and Kawamura, Mueller, Roesnick, and Ziegler (2015), we review several common admissible representations of continuous real functions, study their polynomial-time reducibility lattice and benchmark their performance using AERN, a Haskell library for exact real computation. In particular, we show that rigorous approximation by piecewise polynomials is polynomial-time equivalent to rigorous approximation by rational functions.

Each representation of continuous functions offers a trade-off between the ease of computing names of functions (i.e. writing) and the ease of extracting information about a function (i.e. reading). We formalise this concept by switching from representations to compositional evaluation strategies over term signatures. A signature includes a set of function operations, some of which just create functions, some read existing functions (e.g. computing the range) and some do both (e.g. pointwise addition). We reformulate some existing results about function representations as Pareto optimality with respect to certain signatures. We implemented evaluation strategies based on the chosen representations and benchmark their performance on a small set of terms.
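The "reading" side of a polynomial-based representation can be sketched as follows (a toy model of our own, in Python rather than Haskell, and not AERN's actual API): once a function is named by polynomial coefficients together with a uniform error radius, a rigorous enclosure of its range over a subinterval falls out of interval Horner evaluation. For simplicity the sketch ignores outward rounding of floating-point operations, which a rigorous implementation would have to add.

```python
def ieval(coeffs, lo, hi):
    """Evaluate the polynomial with the given coefficients (constant term
    first) over the interval [lo, hi] via interval Horner's scheme.
    Returns an interval containing the polynomial's range on [lo, hi]
    (possibly with overestimation, as usual for interval arithmetic)."""
    rlo, rhi = 0.0, 0.0
    for c in reversed(coeffs):
        # interval product [rlo, rhi] * [lo, hi], then add the coefficient
        products = (rlo * lo, rlo * hi, rhi * lo, rhi * hi)
        rlo, rhi = min(products) + c, max(products) + c
    return rlo, rhi

def enclosure_range(coeffs, err, lo, hi):
    """Range enclosure for every function g with |g(x) - p(x)| <= err,
    where p is the polynomial named by coeffs."""
    a, b = ieval(coeffs, lo, hi)
    return a - err, b + err

# x**2 on [0, 1], known up to error 0.25: range is enclosed in [-0.25, 1.25].
assert enclosure_range([0.0, 0.0, 1.0], 0.25, 0.0, 1.0) == (-0.25, 1.25)
```

Writing such a name (i.e. producing good coefficients and a small error bound for a given function) is where the cost moves under this representation, which is exactly the read/write trade-off the abstract formalises via signatures.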