Amplitudes carry information in both their magnitude and their phase. When you encode your data in amplitudes, you need to account for both.
The promise of quantum computing has always been its potential to overcome the limitations of classical hardware. Cracking hard-to-solve problems. Simulating complex molecules. Training machine learning models on astronomically large datasets.
Among these promises, amplitude encoding stands out like a siren song.
Load your entire N-dimensional dataset into just log₂ N qubits, and suddenly you have exponential data compression. It takes a single line of Qiskit code:
qc.initialize([x0/norm, x1/norm, ..., x_{N-1}/norm], qc.qubits)
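For concreteness, here is a minimal runnable sketch of that one-liner, assuming Qiskit and NumPy are installed and using a toy 4-element vector as stand-in data (any names and values here are illustrative, not from the original):

import numpy as np
from qiskit import QuantumCircuit

x = np.array([0.1, 0.4, 0.8, 0.3])   # hypothetical classical data, N = 4
norm = np.linalg.norm(x)             # amplitudes must form a unit vector
qc = QuantumCircuit(2)               # log2(N) = 2 qubits hold all 4 values
qc.initialize(x / norm, qc.qubits)   # amplitude-encode the normalized data

Note that the normalization is not optional: initialize expects a state vector of unit length, so the data's overall scale is lost unless you track it separately.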
It seems to give you godlike power over data. It is all too tempting to believe that exponential storage automatically leads to exponential speedups in your algorithm.
"Could that be true?"