# Quantitative Mapping of Computational Boundaries

[![Paper](https://img.shields.io/badge/Paper-PDF-red)](computational_boundary_paper.pdf)
[![DOI](https://img.shields.io/badge/DOI-10.57967/hf/7067-blue)](https://doi.org/10.57967/hf/7067)
[![License](https://img.shields.io/badge/License-MIT-green.svg)](LICENSE)

**A Statistical Field Theory Approach to Phase Transitions in NP-Hard Problems**

**Author:** Zixi Li (Oz Lee)
**Affiliation:** Noesis Lab (Independent Research Group)
**Contact:** [email protected]

---

## Overview

Classical computability theory tells us that computational boundaries **exist** (halting problem, P vs NP), but it doesn't answer: **where exactly are these boundaries?**

This paper presents the first **quantitative mapping** of computational phase transitions through Monte Carlo experiments on 22,000 constraint satisfaction instances. We discover universal laws governing the solvability boundary and extend this framework to natural language via pure NLP semantics.

### Key Question

For a problem of size $L$ with constraint density $d$, what is the probability $\mu(L,d)$ of finding a solution?

**Traditional answer:** "NP-hard ⇒ exponentially hard" (asymptotic)

**Our answer:** $\mu(L,d) = \frac{1}{2}(1 - \text{erf}((d - d_c(L))/\sigma))$ where $d_c(L) = -0.0809\ln(L) + 0.501$ (exact formula)

---

## Main Contributions

### 1. Three Universal Laws

We discover three fundamental laws governing computational boundaries:

**Logarithmic Scaling Law:**
```
d_c(L) = -0.0809 ln(L) + 0.501
```
with MSE ∼ 10⁻³² (machine precision!)

**Universal Phase Transition Kernel:**
```
K(x) = 1/2 (1 - erf(x/σ))
```
with universal constant σ = 0.1007

**Self-Constraint Theory:**
```
C = 1 - λ_min/λ_max
```
Constraint strength emerges from eigenvalue spectrum of word embedding covariance—no heuristic rules needed!

### 2. Complete Prediction Formula

Combining all discoveries:

```
μ(L,d) = 1/2 (1 - erf((d - d_c(L))/0.1007))
where d_c(L) = -0.0809 ln(L) + 0.501
```

This formula predicts solvability probability for any problem instance.
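As a minimal sketch, the combined formula is a few lines of standard-library Python (constants taken from the paper's reported fits):

```python
import math

# Fitted constants reported in the paper
ALPHA, BETA, SIGMA = -0.0809, 0.501, 0.1007

def critical_density(L: int) -> float:
    """Critical constraint density d_c(L) from the logarithmic scaling law."""
    return ALPHA * math.log(L) + BETA

def solvability(L: int, d: float) -> float:
    """Predicted solvability probability mu(L, d)."""
    return 0.5 * (1.0 - math.erf((d - critical_density(L)) / SIGMA))

# A size-64 instance: easy well below d_c(64) ~ 0.165, hard well above it
print(solvability(64, 0.05))  # ~0.95
print(solvability(64, 0.35))  # ~0.005
```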

### 3. Natural Language Extension

We extend the framework to arbitrary problems described in natural language:

```
μ(I,C) = 1/2 (1 - erf((C - C_c(I))/σ))
where:
  I = information complexity (from text)
  C = self-constraint strength (from embeddings)
  C_c(I) = -0.0809 I + 0.501
```
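The same kernel applies in the language domain once I and C have been extracted from the text. A sketch (constants as above; the inputs I and C are assumed precomputed by the NLP pipeline):

```python
import math

ALPHA, BETA, SIGMA = -0.0809, 0.501, 0.1007  # same universal constants

def predict_mu(I: float, C: float) -> float:
    """mu(I, C): solvability predicted from a problem's information
    complexity I and self-constraint strength C (both derived from text)."""
    C_c = ALPHA * I + BETA  # critical constraint threshold
    return 0.5 * (1.0 - math.erf((C - C_c) / SIGMA))

print(predict_mu(1.54, 0.09))  # "Sort array of numbers": ~1.00 (trivial)
print(predict_mu(2.53, 0.39))  # "TSP + 5 required edges": ~0.09 (intractable)
```

Plugging in the (I, C) pairs from the natural-language results table reproduces the reported μ values.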

---

## Methodology: The Pea Experiment

We propose a **Monte Carlo boundary mapping** approach inspired by area estimation:

1. **Throw peas randomly** across parameter space (L, d)
2. For each point, sample N problem instances
3. Run solver and record success/failure
4. Estimate μ(L,d) = successes/N
5. Map the entire solvability landscape

**Total experiments:** 22,000 samples
**Problem sizes:** L ∈ {8, 12, 16, 24, 32, 48, 64, 96, 128, 192, 256}
**Constraint densities:** d ∈ [0.005, 0.4]
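
The five steps above can be sketched end to end. The instance generator below is a deliberately simplified stand-in (random 3-variable XOR constraints, with a hypothetical mapping from density d to constraint count), not the paper's OpenXOR benchmark, so the transition it exhibits sits at a different d_c:

```python
import random
from itertools import product

def sample_instance(L, d, rng):
    """Toy instance: k random 3-variable XOR (parity) constraints over
    L bits, with k scaled by the constraint density d.
    (Hypothetical generator, not the paper's OpenXOR DSL.)"""
    k = max(1, round(d * L * L))
    return [([rng.randrange(L) for _ in range(3)], rng.randrange(2))
            for _ in range(k)]

def solvable(instance, L):
    """Exhaustive satisfiability check (viable only for small L)."""
    return any(
        all(sum(bits[i] for i in idx) % 2 == parity
            for idx, parity in instance)
        for bits in product((0, 1), repeat=L))

def estimate_mu(L, d, n_samples=50, seed=0):
    """Steps 2-4: sample N instances at (L, d), solve, average successes."""
    rng = random.Random(seed)
    return sum(solvable(sample_instance(L, d, rng), L)
               for _ in range(n_samples)) / n_samples

# Step 5: sweep d at fixed L to trace one slice of the landscape
L = 8
for d in (0.01, 0.1, 0.2, 0.4):
    print(d, estimate_mu(L, d, n_samples=20))
```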

---

## Key Results

### Phase Transition Discovery

![Phase Diagram](phase_diagram.png)

Sharp transitions observed with:
- Transition width Δd ≈ 0.1
- Low density (d < 0.05): μ = 0.996 ± 0.012
- High density (d > 0.3): μ = 0.278 ± 0.102
- Transition amplitude: Δμ ≈ 0.72

### Universal Kernel Collapse

![Universal Kernel](universal_kernel_analysis.png)

All phase transition curves collapse onto a single kernel when aligned:
- Standard deviation after alignment: σ = 0.029
- Reconstruction MSE = 0.0057
- Best fit: Error function (cumulative Gaussian)

### Natural Language Predictions

| Problem | I | C_self | C_c | μ | Prediction |
|---------|---|--------|-----|---|------------|
| Sort array of numbers | 1.54 | 0.09 | 0.38 | 1.00 | ✓ Trivial |
| Hamiltonian cycle in graph | 1.82 | 0.24 | 0.35 | 0.94 | ✓ Easy |
| Sudoku with 40 givens | 2.03 | 0.35 | 0.34 | 0.41 | ✓ Hard |
| TSP + 5 required edges | 2.53 | 0.39 | 0.30 | 0.10 | ✓ Intractable |

Predictions match human intuition without running any solver!

---

## Theoretical Impact

### Connections Across Disciplines

This work reveals deep connections between:

- **Computation:** Phase transitions in solvability
- **Information Theory:** Shannon entropy and constraint budgets
- **Statistical Physics:** Landau phase transition theory
- **Geometry:** Spectral properties of embedding spaces

### Paradigm Shift

| Traditional Complexity | Our Approach |
|------------------------|--------------|
| Constructive proofs | Monte Carlo sampling |
| Asymptotic bounds | Exact μ values |
| Discrete classes (P, NP) | Continuous phase diagram |
| O(·) notation | Machine precision MSE |

### Philosophical Implications

Computability is:
- Not **binary** but **probabilistic** (μ ∈ [0,1])
- Not **qualitative** but **quantitative** (exact formulas)
- Not **symbolic** but **geometric** (embedding properties)

---

## Repository Contents

```
.
├── computational_boundary_paper.pdf       # Full paper
├── computational_boundary_paper.tex       # LaTeX source
├── README.md                              # This file
├── phase_diagram.png                      # Phase transition visualization
├── universal_kernel_analysis.png          # Universal kernel collapse
├── critical_boundary_mu50.png             # Critical boundary curve
├── multi_threshold_boundaries.png         # Multiple threshold analysis
├── tsp_phase_diagram.png                  # TSP cross-validation
└── solvability_predictor_guide.png        # Prediction framework
```

---

## Citation

If you use this work in your research, please cite:

```bibtex
@misc{oz_lee_2025,
    author       = { Oz Lee },
    title        = { Quantitative_Mapping_of_Computational_Boundaries (Revision 9dcb0f8) },
    year         = 2025,
    url          = { https://huggingface.co/datasets/OzTianlu/Quantitative_Mapping_of_Computational_Boundaries },
    doi          = { 10.57967/hf/7067 },
    publisher    = { Hugging Face }
}
```

---

## Key Findings Summary

### 1. Logarithmic Scaling (Machine Precision)

Comparison of different scaling models:

| Model | Formula | MSE |
|-------|---------|-----|
| Power law | d = 0.722 L⁻⁰·³⁹¹ | 1.53×10⁻⁴ |
| Exponential | d = 0.287 e⁻⁰·⁰⁰⁸⁷ᴸ | 3.17×10⁻⁴ |
| **Logarithmic** | **d = -0.0809 ln(L) + 0.501** | **2.62×10⁻³²** |
| Linear | d = -0.00151 L + 0.275 | 6.45×10⁻⁴ |

The logarithmic model achieves **machine precision**—unprecedented in complexity theory!

### 2. Self-Constraint Theory

Traditional keyword-based methods vs. our approach:

| Feature | Keyword Method | Self-Constraint |
|---------|----------------|-----------------|
| Keyword list | Required | ✓ Not needed |
| Domain dependence | Strong | ✓ None |
| Math foundation | Empirical | ✓ Spectral analysis |
| Physical meaning | Weak | ✓ Strong (eigenvalues) |
| Interpretability | Low | ✓ High (geometric) |

**Core insight:** Constraints are not linguistic features—they are **geometric properties** of semantic embedding spaces.

### 3. Information-Constraint Phase Diagram

Universal scaling law:
```
∂C_c/∂I = -0.0809
```

**Interpretation:** Each unit increase in information complexity I lowers the critical constraint threshold C_c by 0.0809.

---

## Applications

### 1. Algorithm Selection
Predict problem difficulty before running any solver—choose appropriate algorithm based on μ prediction.

### 2. Constraint Generation
Design problem instances with target difficulty by controlling (L, d) parameters.

### 3. Complexity Estimation
Estimate computational cost from natural language problem descriptions.

### 4. Educational Tools
Visualize computational phase transitions for teaching complexity theory.

---

## Future Directions

### Theory
- Derive the fitted constants (α = -0.0809, β = 0.501, σ = 0.1007) from first principles
- Prove asymptotic properties of logarithmic law
- Classify other NP problems into universality classes
- Explore quantum computation phase transitions

### Experiments
- More problem types (SAT, graph coloring, knapsack)
- Different solvers (SMT, DPLL, genetic algorithms)
- Industrial real-world instances
- Large-scale parallelization

### Applications
- Automated algorithm selection systems
- Intelligent constraint generation
- Complexity estimation APIs
- Interactive educational software

---

## Limitations

1. **Model dependence:** NLP predictions rely on sentence-transformers/all-MiniLM-L6-v2
2. **Solver baseline:** Only tested backtracking (other algorithms may differ)
3. **Problem scope:** Mainly constraint satisfaction (need more problem types)
4. **Small-size effects:** Discrete artifacts for L < 16
5. **Language:** Only validated on English text

---

## Technical Details

### Benchmark Problem: OpenXOR

A minimal NP-hard problem with:
- **Search space:** 2ⁿ (exponential)
- **Solution density:** ≈ 2⁻ᵏ for k checkpoints
- **Minimal DSL:** Only 2 operations (XOR, NOP)
- **No confounds:** Pure constraint satisfaction

### Self-Constraint Computation

For problem text T with words {w₁, ..., wₙ}:

1. Get embeddings: V = [v₁, ..., vₙ] ∈ ℝⁿˣᵈ
2. Compute covariance: Σ = Cov(V)
3. Eigenvalue decomposition: Σ = Σᵢ λᵢ uᵢuᵢᵀ
4. Extract constraint: C = 1 - λ_min/λ_max

**Physical intuition:**
- λ_min ≈ λ_max (isotropic) ⇒ unconstrained (C ≈ 0)
- λ_min ≪ λ_max (compressed) ⇒ constrained (C ≈ 1)
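
Steps 1-4 translate directly into NumPy. The demo below uses synthetic low-dimensional "embeddings" for illustration; real MiniLM vectors are 384-dimensional, where fewer words than dimensions makes the covariance rank-deficient (λ_min = 0, so C = 1), and a rank-aware variant would be needed in practice:

```python
import numpy as np

def self_constraint(V: np.ndarray) -> float:
    """Self-constraint C = 1 - lambda_min/lambda_max of the embedding
    covariance spectrum (rows of V are word vectors)."""
    cov = np.cov(V, rowvar=False)       # d x d covariance matrix
    eigvals = np.linalg.eigvalsh(cov)   # eigenvalues in ascending order
    return 1.0 - eigvals[0] / eigvals[-1]

# Toy demo with synthetic 4-d embeddings
rng = np.random.default_rng(0)
isotropic  = rng.normal(size=(2000, 4))             # no dominant direction
compressed = isotropic * np.array([10.0, 1, 1, 1])  # one dominant axis
print(self_constraint(isotropic))   # small -> weakly constrained
print(self_constraint(compressed))  # near 1 -> strongly constrained
```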

### Information Complexity

```
I = ln(n+1) × (1 + ln(1 + σ²_sem)) × r_unique
```

where:
- ln(n+1) = log-scaled word count n (problem size)
- σ²_sem = semantic diversity of the word embeddings
- r_unique = unique-word ratio (information density)
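
A minimal sketch of this computation (`information_complexity` is an illustrative helper, not code from the paper; σ²_sem would come from the embedding spread but is passed in directly here):

```python
import math

def information_complexity(text: str, sem_variance: float) -> float:
    """I = ln(n+1) * (1 + ln(1 + sigma^2_sem)) * r_unique.
    sem_variance stands in for the semantic diversity sigma^2_sem."""
    words = text.lower().split()
    n = len(words)
    r_unique = len(set(words)) / n  # fraction of distinct words
    return math.log(n + 1) * (1.0 + math.log(1.0 + sem_variance)) * r_unique

print(information_complexity("sort an array of numbers", 0.5))
```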

---

## Acknowledgments

The "pea experiment" methodology was inspired by classical Monte Carlo area estimation (throwing random points at a region to estimate its area). This work demonstrates the power of statistical methods in theoretical computer science.

---

## License

MIT License - See LICENSE file for details

---

## Contact

For questions, collaborations, or discussions:

**Zixi Li (Oz Lee)**
Email: [email protected]
Affiliation: Noesis Lab (Independent Research Group)

---

## Related Work

- **The Incompleteness of Reasoning:** [HuggingFace Dataset](https://huggingface.co/datasets/OzTianlu/The_Incompleteness_of_Reasoning)
- Previous work on computational boundaries and reasoning limits

---

**Last Updated:** January 2025
**Version:** Revision 9dcb0f8