It seems that this problem is equivalent to integer/polynomial squaring:
1. It is known that polynomial multiplication is equivalent to integer multiplication (a minimal sketch of this correspondence follows just below).
2. Obviously, you have already reduced your problem to polynomial/integer squaring. Hence this problem is at most as hard as squaring.
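To make point 1 concrete: a polynomial with small non-negative coefficients can be packed into an integer by evaluating it at a large power of two, multiplied there, and unpacked again (Kronecker substitution). The helper name poly_mul_via_int and the digit width b below are my own illustration, not something from your question:

# Kronecker substitution: one big-integer multiplication
# performs one polynomial multiplication.
def poly_mul_via_int(p, q, b=32):
    # p[i] is the coefficient of x^i; 2**b must exceed
    # every coefficient of the product polynomial.
    pi = sum(c << (b * i) for i, c in enumerate(p))  # p evaluated at x = 2**b
    qi = sum(c << (b * i) for i, c in enumerate(q))  # q evaluated at x = 2**b
    ri = pi * qi                                     # a single integer multiplication
    r = []
    while ri:
        r.append(ri & ((1 << b) - 1))                # low b bits = next coefficient
        ri >>= b
    return r

# (1 + x)^2 = 1 + 2x + x^2
assert poly_mul_via_int([1, 1], [1, 1]) == [1, 2, 1]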
Now I will reduce integer squaring to this problem:
Suppose you had an algorithm:
$$F(\vec{a}) \to P^2(x), \quad \text{where } P(x) = \sum_{a_i \in \vec{a}} x^{a_i}$$
This algorithm is essentially the algorithm you are asking for in your question. So if I had a magical algorithm that could do this, I could build a function $\text{SQUARE}(y)$ that squares the integer $y$ (oh yes, I love mathjax :P):
Algorithm 1: Squaring

procedure SQUARE(y):
    a⃗ ← ()               ▹ a⃗ starts as an empty polynomial sequence
    i ← 0
    while y ≠ 0 do        ▹ break y down into a polynomial in base 2
        if y & 1 then     ▹ if the lsb of y is set
            a⃗ ← a⃗ ∥ i     ▹ append i to a⃗ (appending x^i)
        end if
        i ← i + 1
        y ← y ≫ 1         ▹ shift y right by one
    end while
    P²(x) ← F(a⃗)          ▹ obtain the squared polynomial via F(a⃗)
    return P²(2)          ▹ simply sum up the polynomial at x = 2
end procedure
Python (test with codepad):
#/cs//q/11418/2755
def F(a):
    n = len(a)
    for i in range(n):
        assert a[i] >= 0
    # S maps each exponent r to the coefficient of x^r in P^2(x)
    S = {}
    for ai in a:
        for aj in a:
            r = ai + aj
            if r not in S:
                S[r] = 0
            S[r] += 1
    return list(S.items())

def SQUARE(x):
    x = int(x)
    a = []
    i = 0
    while x != 0:          # break x down into a polynomial in base 2
        if x & 1 == 1:     # if the lsb is set, append the exponent i
            a += [i]
        x >>= 1
        i += 1
    print('a:', a)
    P2 = F(a)              # obtain the squared polynomial via F
    print('P^2:', P2)
    s = 0
    for e, c in P2:        # evaluate P^2 at x = 2, i.e. sum up c * 2^e
        s += (1 << e) * c
    return s
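As a quick sanity check of the reduction (my own test, not part of the original code): for y = 5 we have 5 = 101₂, so a⃗ = (0, 2), P(x) = 1 + x², and P²(2) = 1 + 2·4 + 16 = 25.

print(SQUARE(5))  # prints a: [0, 2] and P^2: [(0, 1), (2, 2), (4, 1)], then 25

# SQUARE agrees with ordinary squaring on a few sample values.
for y in (1, 7, 12, 255):
    assert SQUARE(y) == y * y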
3. Thus, squaring is at most as hard as this problem.
4. Therefore, integer squaring is equivalent to this problem: by (1), (2) and (3), each is at most as hard as the other.
Now, it is unknown whether integer/polynomial multiplication admits bounds better than $O(n \log n)$; in fact, the best multiplication algorithms currently all use the FFT and have running times like $O(n \log n \log \log n)$ (the Schönhage-Strassen algorithm) and $O(n \log n \cdot 2^{O(\log^* n)})$ (Fürer's algorithm). Arnold Schönhage and Volker Strassen conjectured a lower bound of $\Omega(n \log n)$, and so far this seems to be holding.
This doesn't mean your use of the FFT is quicker; $O(n \log n)$ for the FFT is the number of operations (I think), not the bit complexity, so it ignores the cost of the smaller multiplications on the coefficients; applied recursively, it would come closer to the FFT-based multiplication algorithms listed above (see Where is the mistake in this apparently-O(n lg n) multiplication algorithm?).
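For completeness, here is what the floating-point FFT approach to your problem looks like; a minimal sketch using numpy (my own illustration, not code from your question). The rounding at the end is exactly where the ignored precision cost hides: it is only reliable while the coefficients stay small.

import numpy as np

def square_poly_fft(a):
    # Square P(x) = sum_i x^{a_i} with one FFT; returns (exponent, coefficient) pairs.
    n = max(a) + 1
    coeffs = np.zeros(2 * n)           # room for degrees up to 2*(n - 1)
    for ai in a:
        coeffs[ai] += 1                # coefficient vector of P(x)
    fa = np.fft.rfft(coeffs)           # evaluate P at roots of unity
    sq = np.fft.irfft(fa * fa, 2 * n)  # pointwise square, then interpolate back
    return [(e, int(round(c))) for e, c in enumerate(sq) if round(c) != 0]

# Matches F([0, 2]): P(x) = 1 + x^2, so P^2(x) = 1 + 2x^2 + x^4.
print(square_poly_fft([0, 2]))         # [(0, 1), (2, 2), (4, 1)]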
5. Now, your problem is not exactly multiplication, it is squaring. So is squaring easier? Well, that is an open problem (so for now: no): squaring is not known to admit a faster algorithm than multiplication. If you could find a better algorithm for your problem than going through multiplication, that would likely be a breakthrough.
So as of now, the answer to both your questions is: no. As of now, all the near-$O(n \log n)$ multiplication algorithms use the FFT, and as of now squaring is as hard as multiplication. And no: unless a faster algorithm for squaring is found, or multiplication breaks the $O(n \log n)$ barrier, your problem cannot be solved faster than $O(n \log n)$; in fact, it cannot currently be solved in $O(n \log n)$ either, as the best multiplication algorithms only approach that complexity.