- •Preface
- •Contents
- •Contributors
- •Modeling Meaning Associated with Documental Entities: Introducing the Brussels Quantum Approach
- •1 Introduction
- •2 The Double-Slit Experiment
- •3 Interrogative Processes
- •4 Modeling the QWeb
- •5 Adding Context
- •6 Conclusion
- •Appendix 1: Interference Plus Context Effects
- •Appendix 2: Meaning Bond
- •References
- •1 Introduction
- •2 Bell Test in the Problem of Cognitive Semantic Information Retrieval
- •2.1 Bell Inequality and Its Interpretation
- •2.2 Bell Test in Semantic Retrieving
- •3 Results
- •References
- •1 Introduction
- •2 Basics of Quantum Probability Theory
- •3 Steps to Build an HSM Model
- •3.1 How to Determine the Compatibility Relations
- •3.2 How to Determine the Dimension
- •3.5 Compute the Choice Probabilities
- •3.6 Estimate Model Parameters, Compare and Test Models
- •4 Computer Programs
- •5 Concluding Comments
- •References
- •Basics of Quantum Theory for Quantum-Like Modeling Information Retrieval
- •1 Introduction
- •3 Quantum Mathematics
- •3.1 Hermitian Operators in Hilbert Space
- •3.2 Pure and Mixed States: Normalized Vectors and Density Operators
- •4 Quantum Mechanics: Postulates
- •5 Compatible and Incompatible Observables
- •5.1 Post-Measurement State From the Projection Postulate
- •6 Interpretations of Quantum Mechanics
- •6.1 Ensemble and Individual Interpretations
- •6.2 Information Interpretations
- •7 Quantum Conditional (Transition) Probability
- •9 Formula of Total Probability with the Interference Term
- •9.1 Växjö (Realist Ensemble Contextual) Interpretation of Quantum Mechanics
- •10 Quantum Logic
- •11 Space of Square Integrable Functions as a State Space
- •12 Operation of Tensor Product
- •14 Qubit
- •15 Entanglement
- •References
- •1 Introduction
- •2 Background
- •2.1 Distributional Hypothesis
- •2.2 A Brief History of Word Embedding
- •3 Applications of Word Embedding
- •3.1 Word-Level Applications
- •3.2 Sentence-Level Application
- •3.3 Sentence-Pair Level Application
- •3.4 Seq2seq Application
- •3.5 Evaluation
- •4 Reconsidering Word Embedding
- •4.1 Limitations
- •4.2 Trends
- •4.4 Towards Dynamic Word Embedding
- •5 Conclusion
- •References
- •1 Introduction
- •2 Motivating Example: Car Dealership
- •3 Modelling Elementary Data Types
- •3.1 Orthogonal Data Types
- •3.2 Non-orthogonal Data Types
- •4 Data Type Construction
- •5 Quantum-Based Data Type Constructors
- •5.1 Tuple Data Type Constructor
- •5.2 Set Data Type Constructor
- •6 Conclusion
- •References
- •Incorporating Weights into a Quantum-Logic-Based Query Language
- •1 Introduction
- •2 A Motivating Example
- •5 Logic-Based Weighting
- •6 Related Work
- •7 Conclusion
- •References
- •Searching for Information with Meet and Join Operators
- •1 Introduction
- •2 Background
- •2.1 Vector Spaces
- •2.2 Sets Versus Vector Spaces
- •2.3 The Boolean Model for IR
- •2.5 The Probabilistic Models
- •3 Meet and Join
- •4 Structures of a Query-by-Theme Language
- •4.1 Features and Terms
- •4.2 Themes
- •4.3 Document Ranking
- •4.4 Meet and Join Operators
- •5 Implementation of a Query-by-Theme Language
- •6 Related Work
- •7 Discussion and Future Work
- •References
- •Index
- •Preface
- •Organization
- •Contents
- •Fundamentals
- •Why Should We Use Quantum Theory?
- •1 Introduction
- •2 On the Human Science/Natural Science Issue
- •3 The Human Roots of Quantum Science
- •4 Qualitative Parallels Between Quantum Theory and the Human Sciences
- •5 Early Quantitative Applications of Quantum Theory to the Human Sciences
- •6 Epilogue
- •References
- •Quantum Cognition
- •1 Introduction
- •2 The Quantum Persuasion Approach
- •3 Experimental Design
- •3.1 Testing for Perspective Incompatibility
- •3.2 Quantum Persuasion
- •3.3 Predictions
- •4 Results
- •4.1 Descriptive Statistics
- •4.2 Data Analysis
- •4.3 Interpretation
- •5 Discussion and Concluding Remarks
- •References
- •1 Introduction
- •2 A Probabilistic Fusion Model of Trust
- •3 Contextuality
- •4 Experiment
- •4.1 Subjects
- •4.2 Design and Materials
- •4.3 Procedure
- •4.4 Results
- •4.5 Discussion
- •5 Summary and Conclusions
- •References
- •Probabilistic Programs for Investigating Contextuality in Human Information Processing
- •1 Introduction
- •2 A Framework for Determining Contextuality in Human Information Processing
- •3 Using Probabilistic Programs to Simulate Bell Scenario Experiments
- •References
- •1 Familiarity and Recollection, Verbatim and Gist
- •2 True Memory, False Memory, over Distributed Memory
- •3 The Hamiltonian Based QEM Model
- •4 Data and Prediction
- •5 Discussion
- •References
- •Decision-Making
- •1 Introduction
- •1.2 Two Stage Gambling Game
- •2 Quantum Probabilities and Waves
- •2.1 Intensity Waves
- •2.2 The Law of Balance and Probability Waves
- •2.3 Probability Waves
- •3 Law of Maximal Uncertainty
- •3.1 Principle of Entropy
- •3.2 Mirror Principle
- •4 Conclusion
- •References
- •1 Introduction
- •4 Quantum-Like Bayesian Networks
- •7.1 Results and Discussion
- •8 Conclusion
- •References
- •Cybernetics and AI
- •1 Introduction
- •2 Modeling of the Vehicle
- •2.1 Introduction to Braitenberg Vehicles
- •2.2 Quantum Approach for BV Decision Making
- •3 Topics in Eigenlogic
- •3.1 The Eigenlogic Operators
- •3.2 Incorporation of Fuzzy Logic
- •4 BV Quantum Robot Simulation Results
- •4.1 Simulation Environment
- •5 Quantum Wheel of Emotions
- •6 Discussion and Conclusion
- •7 Credits and Acknowledgements
- •References
- •1 Introduction
- •2.1 What Is Intelligence?
- •2.2 Human Intelligence and Quantum Cognition
- •2.3 In Search of the General Principles of Intelligence
- •3 Towards a Moral Test
- •4 Compositional Quantum Cognition
- •4.1 Categorical Compositional Model of Meaning
- •4.2 Proof of Concept: Compositional Quantum Cognition
- •5 Implementation of a Moral Test
- •5.2 Step II: A Toy Example, Moral Dilemmas and Context Effects
- •5.4 Step IV. Application for AI
- •6 Discussion and Conclusion
- •Appendix A: Example of a Moral Dilemma
- •References
- •Probability and Beyond
- •1 Introduction
- •2 The Theory of Density Hypercubes
- •2.1 Construction of the Theory
- •2.2 Component Symmetries
- •2.3 Normalisation and Causality
- •3 Decoherence and Hyper-decoherence
- •3.1 Decoherence to Classical Theory
- •4 Higher Order Interference
- •5 Conclusions
- •A Proofs
- •References
- •Information Retrieval
- •1 Introduction
- •2 Related Work
- •3 Quantum Entanglement and Bell Inequality
- •5 Experiment Settings
- •5.1 Dataset
- •5.3 Experimental Procedure
- •6 Results and Discussion
- •7 Conclusion
- •A Appendix
- •References
- •Investigating Bell Inequalities for Multidimensional Relevance Judgments in Information Retrieval
- •1 Introduction
- •2 Quantifying Relevance Dimensions
- •3 Deriving a Bell Inequality for Documents
- •3.1 CHSH Inequality
- •3.2 CHSH Inequality for Documents Using the Trace Method
- •4 Experiment and Results
- •5 Conclusion and Future Work
- •A Appendix
- •References
- •Short Paper
- •An Update on Updating
- •References
- •Author Index
- •The Sure Thing principle, the Disjunction Effect and the Law of Total Probability
- •Material and methods
- •Experimental results.
- •Experiment 1
- •Experiment 2
- •More versus less risk averse participants
- •Theoretical analysis
- •Shared features of the theoretical models
- •The Markov model
- •The quantum-like model
- •Logistic model
- •Theoretical model performance
- •Model comparison for risk attitude partitioning.
- •Discussion
- •Authors contributions
- •Ethical clearance
- •Funding
- •Acknowledgements
- •References
- •Markov versus quantum dynamic models of belief change during evidence monitoring
- •Results
- •Model comparisons.
- •Discussion
- •Methods
- •Participants.
- •Task.
- •Procedure.
- •Mathematical Models.
- •Acknowledgements
- •New Developments for Value-based Decisions
- •Context Effects in Preferential Choice
- •Comparison of Model Mechanisms
- •Qualitative Empirical Comparisons
- •Quantitative Empirical Comparisons
- •Neural Mechanisms of Value Accumulation
- •Neuroimaging Studies of Context Effects and Attribute-Wise Decision Processes
- •Concluding Remarks
- •Acknowledgments
- •References
- •Comparison of Markov versus quantum dynamical models of human decision making
- •CONFLICT OF INTEREST
- •Endnotes
- •FURTHER READING
- •REFERENCES
142 S. Gogioso and C. M. Scandolo
reversible [4, 5]. Further work has ruled out higher-order interference based on thermodynamic considerations [4, 15].
Other literature has instead focused on the analysis of specific features that theories with higher-order interference would possess, e.g. whether they would provide any advantage in certain computational tasks [16–19]. It was also shown that theories having second-order interference and lacking interference of higher orders are relatively close to quantum theory [2, 21, 27, 28].
Unfortunately, one of the major shortcomings in the study of higher-order interference is the scarcity of concrete models displaying such post-quantum features, so it has so far been very hard to look for specific examples of paradoxical or counter-intuitive consequences. Two models—density cubes [10] and quartic quantum theory [30]—have been proposed in the past, but they are not fully defined operational theories, e.g. because they do not deal with composite systems [18]. This limitation precludes them from being used to study all possible consequences of higher-order interference, including potential violations of Tsirelson’s bound.
In this article, we provide the first complete construction of a full-fledged operational theory exhibiting interference up to the fourth order. Our construction is inspired by the double-dilation construction of [29] and the higher-order CPM constructions of [12], and it is carried out within the framework of categorical probabilistic theories [11]. The resulting theory of ‘density hypercubes’ has composite systems, exhibits higher-order interference and possesses hyper-decoherence maps [18, 20, 30]. Quantum theory, with its second-order interference, is an extension of classical theory: the latter can be recovered by decoherence, which eliminates the second-order interference effects. Similarly, the theory of density hypercubes, with its third- and fourth-order interference, is an extension of quantum theory: the latter can now be recovered by hyper-decoherence, which eliminates third- and fourth-order interference effects.
The paper is organized as follows. In Sect. 2, we define the categorical probabilistic theory of density hypercubes using the double-dilation construction. In Sect. 3, we define hyper-decoherence maps, and show that quantum theory is recovered in the Karoubi envelope. In Sect. 4, we show that density hypercubes display interference of third and fourth order, but not of fifth order and above. In Sect. 5, finally, we discuss open questions and future lines of research. Proofs of all results can be found in the Appendix.
2 The Theory of Density Hypercubes
2.1 Construction of the Theory
In this section, we define the categorical probabilistic theory of density hypercubes, using a recently introduced construction known as double dilation [29].
The construction is done in two steps: first we define the category DD(fHilb), containing hyper-quantum systems and the processes between them, and only in a second moment do we introduce quantum and classical systems, using (hyper-)decoherence and working in the Karoubi envelope Split(DD(fHilb)).
The double-dilation category DD(fHilb) is defined to be a symmetric monoidal subcategory of CPM(fHilb), with objects—the density hypercubes—of the form DD(𝓗) := 𝓗 ⊗ 𝓗, where H is a finite-dimensional Hilbert space and 𝓗 := H ⊗ H is the corresponding doubled system in the CPM category. Even though DD(fHilb) is symmetric monoidal and has its own graphical calculus, in this work we will always use the graphical calculi of CPM(fHilb) and fHilb to talk about density hypercubes. When working in CPM(fHilb), we will use solid black lines for morphisms and calligraphic letters (e.g. 𝓗) for objects. When working in fHilb, we will use solid grey lines for morphisms and plain letters (e.g. H) for objects.
The morphisms DD(𝓗) → DD(𝒦) in DD(fHilb) are the CP maps 𝓗 ⊗ 𝓗 → 𝒦 ⊗ 𝒦 taking the following form, for a doubled CP map F, some auxiliary systems 𝓔, 𝒢 and some special commutative †-Frobenius algebra (henceforth known as a classical structure) on G in fHilb:
[string diagram not recoverable from this extraction: the generic morphism is built from the doubled map F̄ ⊗ F, with the auxiliary system 𝓔 discarded and the two copies of G connected through the classical structure]   (1)
In the diagram above, F is a doubled CP map 𝓗 → 𝒢 ⊗ 𝒦 ⊗ 𝓔 in CPM(fHilb)—i.e. one of the form F = f̄ ⊗ f for some f : H → G ⊗ K ⊗ E in fHilb—and we have used F̄ to denote the CP map obtained by inverting the tensor product ordering of inputs and outputs of f (for purely aesthetic reasons). We will always use upper-case letters (e.g. F) to denote doubled CP maps in CPM(fHilb), lower-case letters to denote the corresponding linear maps in fHilb, and we will always write discarding maps explicitly.
Composition in DD(fHilb) is the same as composition of CP maps, while the tensor product is only slightly adjusted to take into account the doubled format of our new morphisms:
[diagrammatic equation not recoverable from this extraction: the tensor product F ⊗ G of two morphisms of density hypercubes is the CP-map tensor product with the doubled components F̄, Ḡ, F, G suitably reordered]   (2)
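To make the doubled format concrete, here is a minimal numpy sketch. It is our own illustration, not the paper's formalism: the component convention for rank-one hypercube states and the componentwise action of a fully doubled map are assumptions chosen only to exhibit the "four copies of f, two of them conjugated" pattern that doubling twice suggests.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2

# Assumed component convention for a rank-one hypercube state:
# rho[i, j, k, l] = phi[i] * conj(phi[j]) * conj(phi[k]) * phi[l]
phi = rng.normal(size=d) + 1j * rng.normal(size=d)
rho = np.einsum("i,j,k,l->ijkl", phi, phi.conj(), phi.conj(), phi)

# A fully doubled map applies four copies of f : C^d -> C^d, two conjugated,
# mirroring the way a CP map in CPM(fHilb) applies two copies of f.
f = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
out = np.einsum("ai,bj,ck,dl,ijkl->abcd", f, f.conj(), f.conj(), f, rho)

# The image is again a rank-one hypercube state, now built from f @ phi.
expected = np.einsum("i,j,k,l->ijkl",
                     f @ phi, (f @ phi).conj(), (f @ phi).conj(), f @ phi)
assert np.allclose(out, expected)
```

Under these assumptions, doubled maps send rank-one hypercube states to rank-one hypercube states, just as doubled maps in CPM(fHilb) send pure density matrices to pure density matrices.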
Just as was the case for CP maps, maps of density hypercubes can all be obtained as composition of a “doubled” map and one or two “discarding” maps:
[string diagram not recoverable from this extraction: a generic map of density hypercubes factored as a doubled map followed by discarding maps on 𝓔 and G]   (3)
We refer to the discarding map obtained by doubling 𝓔 as the “forest” and to the discarding map obtained from the classical structure as the “bridge”. The scalars of DD(fHilb) are exactly the scalars R+ of CPM(fHilb), and hence the theory of density hypercubes is probabilistic. It is furthermore convex, because the following “tree-on-a-bridge” effects can be used to add up maps of density hypercubes—analogously to the way ordinary discarding maps can be used to add up CP maps in CPM(fHilb)—by expanding them in terms of the orthonormal bases (|ψx⟩)x∈X associated [8] with the classical structures:
[diagrammatic equation not recoverable from this extraction: the tree-on-a-bridge effect on DD(𝓗) expands as a sum over x ∈ X of doubled effects Ψx†]   (4)
2.2 Component Symmetries
States in the theory DD(fHilb) take the form of fourth-order tensors, an observation which prompted the choice of “density hypercubes” as a name for the theory. If (|ψx⟩)x∈X is a choice of orthonormal basis for some finite-dimensional Hilbert space H, the states on DD(𝓗) = 𝓗 ⊗ 𝓗 in DD(fHilb) can be expanded as follows in fHilb:
[diagrammatic equation not recoverable from this extraction: a generic state Φ of DD(𝓗) expanded as a sum over indices x00, x01, x10, x11 ∈ X of basis effects ψ† and ψT applied to its four legs, yielding components ρ_{x00 x01 x10 x11}]   (5)
Recall that density matrices possess a Z2 symmetry given by self-adjointness. This symmetry can be understood in terms of the following action τ : Z2 → Aut(C) of Z2 on the complex numbers:
τ(0) := z ↦ z        τ(1) := z ↦ z̄        (6)
The components of a density matrix ρ then satisfy the following equation for every a ∈ Z2 (trivial for a = 0, self-adjointness for a = 1):
τ(a)(ρ_{x_0 x_1}) = ρ_{x_{0⊕a} x_{1⊕a}}        (7)
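As a quick numerical illustration (a sketch of our own, not from the paper): for a = 1, the equation above is exactly self-adjointness of the density matrix.

```python
import numpy as np

# A random density matrix rho = V V† / tr(V V†) on C^3 (illustrative only).
rng = np.random.default_rng(0)
V = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
rho = V @ V.conj().T
rho /= np.trace(rho).real

# a = 0: tau(0) is the identity and the indices do not move (trivial case).
# a = 1: tau(1) conjugates while the two indices swap, so Eq. (7) reads
#        conj(rho[x0, x1]) == rho[x1, x0], i.e. rho is self-adjoint.
assert np.allclose(np.conj(rho), rho.T)
```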
Instead of a Z2 symmetry, density hypercubes possess a Z2 × Z2 symmetry. This symmetry can be understood in terms of the following action τ : Z2 × Z2 → Aut(ℂ) of Z2 × Z2 on the complex numbers:

τ(0, 0) := z ↦ z        τ(0, 1) := z ↦ z̄
τ(1, 0) := z ↦ z̄        τ(1, 1) := z ↦ z        (8)
The components of a density hypercube ρ satisfy the following equation for every (a, b) ∈ Z2 × Z2, where by ⊕ we denote addition in Z2:

τ(a, b)(ρ_{x_{(0,0)} x_{(0,1)} x_{(1,0)} x_{(1,1)}}) = ρ_{x_{(0⊕a, 0⊕b)} x_{(0⊕a, 1⊕b)} x_{(1⊕a, 0⊕b)} x_{(1⊕a, 1⊕b)}}        (9)
We see that the components are related by a trivial symmetry for (a, b) = (0, 0), by a self-adjoining symmetry for (a, b) = (1, 0) and (a, b) = (0, 1), and by a self-transposing symmetry for (a, b) = (1, 1). An alternative way to look at this symmetry is to observe that states of density hypercubes can all be expressed as certain sums of doubled states in the following form:
[diagram not recoverable from this extraction: the doubled state Φ ⊗ Φ on the system 𝒦]   (10)
For these states, we have the usual self-conjugating Z2 symmetry of density matrices, as well as an independent self-transposing Z2 symmetry; taken together, these give the same Z2 × Z2 symmetry described above in terms of components.
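The three non-trivial component symmetries of Eq. (9) can be checked numerically on sums of "doubled twice" rank-one tensors. This is a sketch under an assumed component convention, ρ[x00, x01, x10, x11] = φ[x00]·conj(φ[x01])·conj(φ[x10])·φ[x11], which is our own illustration rather than the paper's definition:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3

# Sum of "doubled twice" rank-one tensors (assumed component convention).
rho = np.zeros((d, d, d, d), dtype=complex)
for _ in range(4):
    phi = rng.normal(size=d) + 1j * rng.normal(size=d)
    rho += np.einsum("i,j,k,l->ijkl", phi, phi.conj(), phi.conj(), phi)

# (a, b) = (1, 0): conjugation plus swap of index pairs (x00,x01) <-> (x10,x11)
assert np.allclose(rho.conj(), rho.transpose(2, 3, 0, 1))
# (a, b) = (0, 1): conjugation plus swap within each pair
assert np.allclose(rho.conj(), rho.transpose(1, 0, 3, 2))
# (a, b) = (1, 1): no conjugation, full reversal of the four indices
assert np.allclose(rho, rho.transpose(3, 2, 1, 0))
```

Each assertion is one of the three non-trivial group elements; their composition recovers the trivial (0, 0) case, so together they realise the full Z2 × Z2 action.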
In order to visualise the Z2 × Z2 symmetry action, we divide the components ρ_{x00 x01 x10 x11} of a d-dimensional density hypercube ρ into 15 classes, depending on which of the indices x00, x01, x10, x11 take same/distinct values chosen from the set {1, ..., d}. We arrange the indices on a square: index 00 is on the top left corner, 10 acts as reflection about the vertical mid-line, 01 acts as reflection about the horizontal mid-line and 11 acts as a 180° rotation about the centre. We use colours as names for index values in {1, ..., d}, with distinct colours denoting distinct values.
[figure not recoverable from this extraction: the 15 index classes arranged by number of distinct index values — 4 distinct, 3 distinct, 2 distinct, all equal]   (11)
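The count of 15 classes is the number of ways four indices can be grouped by equality of values, i.e. the number of set partitions of a 4-element set, the Bell number B(4) = 15. A short check (our own script, with a hypothetical `pattern` helper):

```python
from itertools import product

def pattern(t):
    """Canonical equality pattern of a tuple: relabel values by first
    appearance, e.g. (5, 2, 5, 9) -> (0, 1, 0, 2)."""
    seen = {}
    return tuple(seen.setdefault(v, len(seen)) for v in t)

d = 4  # need d >= 4 so that the "all four distinct" class actually occurs
patterns = {pattern(t) for t in product(range(d), repeat=4)}
print(len(patterns))  # -> 15, the Bell number B(4)
```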