Use Table lookup for HPACK decoder (#9307)

Motivation:
Table-based decoding is fast.

Modification:
Use table-based decoding in the HPACK decoder, inspired by
https://github.com/python-hyper/hpack/blob/master/hpack/huffman_table.py

This modifies the table to be based on integers rather than 3-tuples of
bytes.  This is for two reasons:

1.  It's faster.
2.  Using bytes makes the static initializer too big, so it doesn't
compile.
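
To make the approach concrete, here is a small, self-contained sketch of
table-driven Huffman decoding with packed-int entries.  It is NOT Netty's
generated table or the RFC 7541 HPACK code: the class name
(TinyTableHuffman), the four-symbol toy code, and the example bytes are all
made up for illustration.  It only mirrors the shape of the lookup loop:
walk a precomputed table four bits at a time, where each entry packs the
next state, a couple of flags, and an optional decoded symbol into one int.

```
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public final class TinyTableHuffman {

    // Toy prefix code (NOT the HPACK code). Every code word here is at least
    // 4 bits long, so one 4-bit step can complete at most one symbol -- the
    // property the real HPACK table relies on (its shortest code is 5 bits).
    private static final char[]   SYMBOLS = {'a', 'b', 'c', 'd'};
    private static final String[] CODES   = {"0000", "0001", "00100", "00101"};

    private static final int FLAG_SYMBOL = 1; // this step completed a symbol
    private static final int FLAG_FAIL   = 2; // bits cannot start any code word

    // table[state * 16 + nibble] = (nextState << 16) | (flags << 8) | symbol
    private static final int[] TABLE = buildTable();

    private static int[] buildTable() {
        Map<String, Character> codeWords = new HashMap<>();
        for (int s = 0; s < CODES.length; s++) {
            codeWords.put(CODES[s], SYMBOLS[s]);
        }
        // States are the proper prefixes of code words; "" (no pending bits) is state 0.
        List<String> states = new ArrayList<>();
        Map<String, Integer> ids = new HashMap<>();
        states.add("");
        ids.put("", 0);
        for (String code : CODES) {
            for (int len = 1; len < code.length(); len++) {
                String prefix = code.substring(0, len);
                if (ids.putIfAbsent(prefix, states.size()) == null) {
                    states.add(prefix);
                }
            }
        }
        int[] table = new int[states.size() * 16];
        for (int state = 0; state < states.size(); state++) {
            for (int nibble = 0; nibble < 16; nibble++) {
                String cur = states.get(state);
                int flags = 0, symbol = 0;
                for (int i = 3; i >= 0 && (flags & FLAG_FAIL) == 0; i--) {
                    cur += (nibble >> i) & 1;       // append the next input bit
                    Character sym = codeWords.get(cur);
                    if (sym != null) {              // completed a code word
                        flags |= FLAG_SYMBOL;
                        symbol = sym;
                        cur = "";
                    } else if (!ids.containsKey(cur)) {
                        flags |= FLAG_FAIL;         // not a prefix of any code word
                    }
                }
                int next = (flags & FLAG_FAIL) != 0 ? 0 : ids.get(cur);
                table[state * 16 + nibble] = (next << 16) | (flags << 8) | symbol;
            }
        }
        return table;
    }

    static String decode(byte[] input) {
        StringBuilder out = new StringBuilder();
        int state = 0;
        for (byte b : input) {
            for (int nibble : new int[] {(b >> 4) & 0xF, b & 0xF}) {
                int entry = TABLE[state * 16 + nibble];
                if ((entry >> 8 & FLAG_FAIL) != 0) {
                    throw new IllegalArgumentException("invalid Huffman data");
                }
                if ((entry >> 8 & FLAG_SYMBOL) != 0) {
                    out.append((char) (entry & 0xFF));
                }
                state = entry >>> 16;
            }
        }
        // Toy rule: input must end exactly on a symbol boundary (real HPACK
        // instead allows up to 7 bits of all-ones EOS padding).
        if (state != 0) {
            throw new IllegalArgumentException("truncated Huffman data");
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // 0x21 0x48 0x50 is "cdcda" under the toy code above.
        System.out.println(decode(new byte[] {0x21, 0x48, 0x50}));
    }
}
```

The real decoder generates its table ahead of time for the RFC 7541 code;
packing each entry into a single int instead of a 3-tuple of bytes is what
keeps that generated static initializer small enough to compile.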

Result:
Faster Huffman decoding.  This only seems to help the ASCII case; the
other decoding throughput is about the same.

Benchmarks:

```
Before:
Benchmark                     (limitToAscii)  (sensitive)  (size)   Mode  Cnt        Score       Error  Units
HpackDecoderBenchmark.decode            true         true   SMALL  thrpt   20   426293.636 ±  1444.843  ops/s
HpackDecoderBenchmark.decode            true         true  MEDIUM  thrpt   20    57843.738 ±   725.704  ops/s
HpackDecoderBenchmark.decode            true         true   LARGE  thrpt   20     3002.412 ±    16.998  ops/s
HpackDecoderBenchmark.decode            true        false   SMALL  thrpt   20   412339.400 ±  1128.394  ops/s
HpackDecoderBenchmark.decode            true        false  MEDIUM  thrpt   20    58226.870 ±   199.591  ops/s
HpackDecoderBenchmark.decode            true        false   LARGE  thrpt   20     3044.256 ±    10.675  ops/s
HpackDecoderBenchmark.decode           false         true   SMALL  thrpt   20  2082615.030 ±  5929.726  ops/s
HpackDecoderBenchmark.decode           false         true  MEDIUM  thrpt   10   571640.454 ± 26499.229  ops/s
HpackDecoderBenchmark.decode           false         true   LARGE  thrpt   20    92714.555 ±  2292.222  ops/s
HpackDecoderBenchmark.decode           false        false   SMALL  thrpt   20  1745872.421 ±  6788.840  ops/s
HpackDecoderBenchmark.decode           false        false  MEDIUM  thrpt   20   490420.323 ±  2455.431  ops/s
HpackDecoderBenchmark.decode           false        false   LARGE  thrpt   20    84536.200 ±   398.714  ops/s

After(bytes):
Benchmark                     (limitToAscii)  (sensitive)  (size)   Mode  Cnt        Score      Error  Units
HpackDecoderBenchmark.decode            true         true   SMALL  thrpt   20   472649.148 ± 7122.461  ops/s
HpackDecoderBenchmark.decode            true         true  MEDIUM  thrpt   20    66739.638 ±  341.607  ops/s
HpackDecoderBenchmark.decode            true         true   LARGE  thrpt   20     3139.773 ±   24.491  ops/s
HpackDecoderBenchmark.decode            true        false   SMALL  thrpt   20   466933.833 ± 4514.971  ops/s
HpackDecoderBenchmark.decode            true        false  MEDIUM  thrpt   20    66111.778 ±  568.326  ops/s
HpackDecoderBenchmark.decode            true        false   LARGE  thrpt   20     3143.619 ±    3.332  ops/s
HpackDecoderBenchmark.decode           false         true   SMALL  thrpt   20  2109995.177 ± 6203.143  ops/s
HpackDecoderBenchmark.decode           false         true  MEDIUM  thrpt   20   586026.055 ± 1578.550  ops/s
HpackDecoderBenchmark.decode           false        false   SMALL  thrpt   20  1775723.270 ± 4932.057  ops/s
HpackDecoderBenchmark.decode           false        false  MEDIUM  thrpt   20   493316.467 ± 1453.037  ops/s
HpackDecoderBenchmark.decode           false        false   LARGE  thrpt   10    85726.219 ±  402.573  ops/s

After(ints):
Benchmark                     (limitToAscii)  (sensitive)  (size)   Mode  Cnt        Score       Error  Units
HpackDecoderBenchmark.decode            true         true   SMALL  thrpt   20   615549.006 ±  5282.283  ops/s
HpackDecoderBenchmark.decode            true         true  MEDIUM  thrpt   20    86714.630 ±   654.489  ops/s
HpackDecoderBenchmark.decode            true         true   LARGE  thrpt   20     3984.439 ±    61.612  ops/s
HpackDecoderBenchmark.decode            true        false   SMALL  thrpt   20   602489.337 ±  5397.024  ops/s
HpackDecoderBenchmark.decode            true        false  MEDIUM  thrpt   20    88399.109 ±   241.115  ops/s
HpackDecoderBenchmark.decode            true        false   LARGE  thrpt   20     3875.729 ±   103.057  ops/s
HpackDecoderBenchmark.decode           false         true   SMALL  thrpt   20  2092165.454 ± 11918.859  ops/s
HpackDecoderBenchmark.decode           false         true  MEDIUM  thrpt   20   583465.437 ±  5452.115  ops/s
HpackDecoderBenchmark.decode           false         true   LARGE  thrpt   20    93290.061 ±   665.904  ops/s
HpackDecoderBenchmark.decode           false        false   SMALL  thrpt   20  1758402.495 ± 14677.438  ops/s
HpackDecoderBenchmark.decode           false        false  MEDIUM  thrpt   10   491598.099 ±  5029.698  ops/s
HpackDecoderBenchmark.decode           false        false   LARGE  thrpt   20    85834.290 ±   554.915  ops/s
```
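
As a rough sanity check on the claim above, compare the ASCII-limited
MEDIUM rows of the Before and After(ints) tables (the values are copied
from the output above; the class is just throwaway arithmetic):

```
public class SpeedupCheck {
    public static void main(String[] args) {
        // limitToAscii=true, sensitive=true, MEDIUM rows from the JMH output above
        double before = 57843.738;      // ops/s, Before
        double afterInts = 86714.630;   // ops/s, After(ints)
        System.out.printf("speedup: %.2fx%n", afterInts / before); // prints ~1.50x
    }
}
```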
Carl Mastrangelo 2019-07-02 11:09:45 -07:00 committed by Norman Maurer
parent ced1d5b751
commit 65d8ecc3a0
18 changed files with 4763 additions and 241 deletions

NOTICE.txt

@@ -206,6 +206,22 @@ the HTTP/2 HPACK algorithm written by Twitter. It can be obtained at:
* HOMEPAGE:
* https://github.com/twitter/hpack
This product contains a modified version of 'HPACK', a Java implementation of
the HTTP/2 HPACK algorithm written by Cory Benfield. It can be obtained at:
* LICENSE:
* license/LICENSE.hyper-hpack.txt (MIT License)
* HOMEPAGE:
* https://github.com/python-hyper/hpack/
This product contains a modified version of 'HPACK', a Java implementation of
the HTTP/2 HPACK algorithm written by Tatsuhiro Tsujikawa. It can be obtained at:
* LICENSE:
* license/LICENSE.nghttp2-hpack.txt (MIT License)
* HOMEPAGE:
* https://github.com/nghttp2/nghttp2/
This product contains a modified portion of 'Apache Commons Lang', a Java library
provides utilities for the java.lang API, which can be obtained at:

AbstractHttp2ConnectionHandlerBuilder.java

@@ -21,7 +21,6 @@ import io.netty.handler.codec.http2.Http2HeadersEncoder.SensitivityDetector;
import io.netty.util.internal.UnstableApi;
import static io.netty.handler.codec.http2.Http2CodecUtil.DEFAULT_HEADER_LIST_SIZE;
import static io.netty.handler.codec.http2.Http2CodecUtil.DEFAULT_INITIAL_HUFFMAN_DECODE_CAPACITY;
import static io.netty.handler.codec.http2.Http2CodecUtil.DEFAULT_MAX_RESERVED_STREAMS;
import static io.netty.handler.codec.http2.Http2PromisedRequestVerifier.ALWAYS_VERIFY;
import static io.netty.util.internal.ObjectUtil.checkPositive;
@@ -64,7 +63,6 @@ import static java.util.Objects.requireNonNull;
* <li>{@link #headerSensitivityDetector(SensitivityDetector)}</li>
* <li>{@link #encoderEnforceMaxConcurrentStreams(boolean)}</li>
* <li>{@link #encoderIgnoreMaxHeaderListSize(boolean)}</li>
* <li>{@link #initialHuffmanDecodeCapacity(int)}</li>
* </ul>
*
* <h3>Exposing necessary methods in a subclass</h3>
@@ -106,7 +104,6 @@ public abstract class AbstractHttp2ConnectionHandlerBuilder<T extends Http2Conne
private SensitivityDetector headerSensitivityDetector;
private Boolean encoderEnforceMaxConcurrentStreams;
private Boolean encoderIgnoreMaxHeaderListSize;
private int initialHuffmanDecodeCapacity = DEFAULT_INITIAL_HUFFMAN_DECODE_CAPACITY;
private Http2PromisedRequestVerifier promisedRequestVerifier = ALWAYS_VERIFY;
private boolean autoAckSettingsFrame = true;
@@ -357,13 +354,12 @@ public abstract class AbstractHttp2ConnectionHandlerBuilder<T extends Http2Conne
}
/**
* Sets the initial size of an intermediate buffer used during HPACK huffman decoding.
* @param initialHuffmanDecodeCapacity initial size of an intermediate buffer used during HPACK huffman decoding.
* @return this.
* Does nothing, do not call.
*
* @deprecated Huffman decoding no longer depends on having a decode capacity.
*/
@Deprecated
protected B initialHuffmanDecodeCapacity(int initialHuffmanDecodeCapacity) {
enforceNonCodecConstraints("initialHuffmanDecodeCapacity");
this.initialHuffmanDecodeCapacity = checkPositive(initialHuffmanDecodeCapacity, "initialHuffmanDecodeCapacity");
return self();
}
@@ -442,7 +438,7 @@ public abstract class AbstractHttp2ConnectionHandlerBuilder<T extends Http2Conne
Long maxHeaderListSize = initialSettings.maxHeaderListSize();
Http2FrameReader reader = new DefaultHttp2FrameReader(new DefaultHttp2HeadersDecoder(isValidateHeaders(),
maxHeaderListSize == null ? DEFAULT_HEADER_LIST_SIZE : maxHeaderListSize,
initialHuffmanDecodeCapacity));
/* initialHuffmanDecodeCapacity= */ -1));
Http2FrameWriter writer = encoderIgnoreMaxHeaderListSize == null ?
new DefaultHttp2FrameWriter(headerSensitivityDetector()) :
new DefaultHttp2FrameWriter(headerSensitivityDetector(), encoderIgnoreMaxHeaderListSize);

DefaultHttp2HeadersDecoder.java

@@ -19,7 +19,6 @@ import io.netty.buffer.ByteBuf;
import io.netty.util.internal.UnstableApi;
import static io.netty.handler.codec.http2.Http2CodecUtil.DEFAULT_HEADER_LIST_SIZE;
import static io.netty.handler.codec.http2.Http2CodecUtil.DEFAULT_INITIAL_HUFFMAN_DECODE_CAPACITY;
import static io.netty.handler.codec.http2.Http2Error.COMPRESSION_ERROR;
import static io.netty.handler.codec.http2.Http2Error.INTERNAL_ERROR;
import static io.netty.handler.codec.http2.Http2Exception.connectionError;
@@ -57,7 +56,7 @@ public class DefaultHttp2HeadersDecoder implements Http2HeadersDecoder, Http2Hea
* (which is dangerous).
*/
public DefaultHttp2HeadersDecoder(boolean validateHeaders, long maxHeaderListSize) {
this(validateHeaders, maxHeaderListSize, DEFAULT_INITIAL_HUFFMAN_DECODE_CAPACITY);
this(validateHeaders, maxHeaderListSize, /* initialHuffmanDecodeCapacity= */ -1);
}
/**
@@ -67,11 +66,11 @@ public class DefaultHttp2HeadersDecoder implements Http2HeadersDecoder, Http2Hea
* This is because <a href="https://tools.ietf.org/html/rfc7540#section-6.5.1">SETTINGS_MAX_HEADER_LIST_SIZE</a>
* allows a lower than advertised limit from being enforced, and the default limit is unlimited
* (which is dangerous).
* @param initialHuffmanDecodeCapacity Size of an intermediate buffer used during huffman decode.
* @param initialHuffmanDecodeCapacity Does nothing, do not use.
*/
public DefaultHttp2HeadersDecoder(boolean validateHeaders, long maxHeaderListSize,
int initialHuffmanDecodeCapacity) {
this(validateHeaders, new HpackDecoder(maxHeaderListSize, initialHuffmanDecodeCapacity));
@Deprecated int initialHuffmanDecodeCapacity) {
this(validateHeaders, new HpackDecoder(maxHeaderListSize));
}
/**
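
For callers, the practical effect of deprecating initialHuffmanDecodeCapacity
is that the two-argument constructor above is all that is needed.  A minimal
sketch follows; the class name and the 8192 limit are placeholders for
illustration, not values from this change:

```
import io.netty.handler.codec.http2.DefaultHttp2HeadersDecoder;

public class HeadersDecoderExample {
    public static void main(String[] args) {
        // validateHeaders=true, maxHeaderListSize=8192 (placeholder); the
        // deprecated three-argument constructor still compiles, but its
        // huffman-capacity argument is now ignored.
        DefaultHttp2HeadersDecoder decoder = new DefaultHttp2HeadersDecoder(true, 8192);
        System.out.println(decoder);
    }
}
```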

HpackDecoder.java

@@ -90,7 +90,6 @@ final class HpackDecoder {
private static final byte READ_LITERAL_HEADER_VALUE = 9;
private final HpackDynamicTable hpackDynamicTable;
private final HpackHuffmanDecoder hpackHuffmanDecoder;
private long maxHeaderListSize;
private long maxDynamicTableSize;
private long encoderMaxDynamicTableSize;
@@ -102,23 +101,21 @@
* This is because <a href="https://tools.ietf.org/html/rfc7540#section-6.5.1">SETTINGS_MAX_HEADER_LIST_SIZE</a>
* allows a lower than advertised limit from being enforced, and the default limit is unlimited
* (which is dangerous).
* @param initialHuffmanDecodeCapacity Size of an intermediate buffer used during huffman decode.
*/
HpackDecoder(long maxHeaderListSize, int initialHuffmanDecodeCapacity) {
this(maxHeaderListSize, initialHuffmanDecodeCapacity, DEFAULT_HEADER_TABLE_SIZE);
HpackDecoder(long maxHeaderListSize) {
this(maxHeaderListSize, DEFAULT_HEADER_TABLE_SIZE);
}
/**
* Exposed Used for testing only! Default values used in the initial settings frame are overridden intentionally
* for testing but violate the RFC if used outside the scope of testing.
*/
HpackDecoder(long maxHeaderListSize, int initialHuffmanDecodeCapacity, int maxHeaderTableSize) {
HpackDecoder(long maxHeaderListSize, int maxHeaderTableSize) {
this.maxHeaderListSize = checkPositive(maxHeaderListSize, "maxHeaderListSize");
maxDynamicTableSize = encoderMaxDynamicTableSize = maxHeaderTableSize;
maxDynamicTableSizeChangeRequired = false;
hpackDynamicTable = new HpackDynamicTable(maxHeaderTableSize);
hpackHuffmanDecoder = new HpackHuffmanDecoder(initialHuffmanDecodeCapacity);
}
/**
@@ -448,7 +445,7 @@ final class HpackDecoder {
private CharSequence readStringLiteral(ByteBuf in, int length, boolean huffmanEncoded) throws Http2Exception {
if (huffmanEncoded) {
return hpackHuffmanDecoder.decode(in, length);
return HpackHuffmanDecoder.decode(in, length);
}
byte[] buf = new byte[length];
in.readBytes(buf);

Http2CodecUtil.java

@@ -117,7 +117,6 @@ public final class Http2CodecUtil {
public static final int SMALLEST_MAX_CONCURRENT_STREAMS = 100;
static final int DEFAULT_MAX_RESERVED_STREAMS = SMALLEST_MAX_CONCURRENT_STREAMS;
static final int DEFAULT_MIN_ALLOCATION_CHUNK = 1024;
static final int DEFAULT_INITIAL_HUFFMAN_DECODE_CAPACITY = 32;
/**
* Calculate the threshold in bytes which should trigger a {@code GO_AWAY} if a set of headers exceeds this amount.

Http2ConnectionHandlerBuilder.java

@@ -88,6 +88,7 @@ public final class Http2ConnectionHandlerBuilder
}
@Override
@Deprecated
public Http2ConnectionHandlerBuilder initialHuffmanDecodeCapacity(int initialHuffmanDecodeCapacity) {
return super.initialHuffmanDecodeCapacity(initialHuffmanDecodeCapacity);
}

Http2FrameCodecBuilder.java

@@ -137,6 +137,7 @@ public class Http2FrameCodecBuilder extends
}
@Override
@Deprecated
public Http2FrameCodecBuilder initialHuffmanDecodeCapacity(int initialHuffmanDecodeCapacity) {
return super.initialHuffmanDecodeCapacity(initialHuffmanDecodeCapacity);
}

Http2MultiplexCodecBuilder.java

@@ -166,6 +166,7 @@ public class Http2MultiplexCodecBuilder
}
@Override
@Deprecated
public Http2MultiplexCodecBuilder initialHuffmanDecodeCapacity(int initialHuffmanDecodeCapacity) {
return super.initialHuffmanDecodeCapacity(initialHuffmanDecodeCapacity);
}

HttpToHttp2ConnectionHandlerBuilder.java

@@ -80,6 +80,7 @@ public final class HttpToHttp2ConnectionHandlerBuilder extends
}
@Override
@Deprecated
public HttpToHttp2ConnectionHandlerBuilder initialHuffmanDecodeCapacity(int initialHuffmanDecodeCapacity) {
return super.initialHuffmanDecodeCapacity(initialHuffmanDecodeCapacity);
}

HpackDecoderTest.java

@@ -79,7 +79,7 @@ public class HpackDecoderTest {
@Before
public void setUp() {
hpackDecoder = new HpackDecoder(8192, 32);
hpackDecoder = new HpackDecoder(8192);
mockHeaders = mock(Http2Headers.class);
}

HpackEncoderTest.java

@@ -33,7 +33,7 @@ public class HpackEncoderTest {
@Before
public void setUp() {
hpackEncoder = new HpackEncoder();
hpackDecoder = new HpackDecoder(DEFAULT_HEADER_LIST_SIZE, 32);
hpackDecoder = new HpackDecoder(DEFAULT_HEADER_LIST_SIZE);
mockHeaders = mock(Http2Headers.class);
}

HpackHuffmanTest.java

@@ -61,56 +61,56 @@ public class HpackHuffmanTest {
for (int i = 0; i < 4; i++) {
buf[i] = (byte) 0xFF;
}
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodeIllegalPadding() throws Http2Exception {
byte[] buf = new byte[1];
buf[0] = 0x00; // '0', invalid padding
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodeExtraPadding() throws Http2Exception {
byte[] buf = makeBuf(0x0f, 0xFF); // '1', 'EOS'
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodeExtraPadding1byte() throws Http2Exception {
byte[] buf = makeBuf(0xFF);
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodeExtraPadding2byte() throws Http2Exception {
byte[] buf = makeBuf(0x1F, 0xFF); // 'a'
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodeExtraPadding3byte() throws Http2Exception {
byte[] buf = makeBuf(0x1F, 0xFF, 0xFF); // 'a'
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodeExtraPadding4byte() throws Http2Exception {
byte[] buf = makeBuf(0x1F, 0xFF, 0xFF, 0xFF); // 'a'
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodeExtraPadding29bit() throws Http2Exception {
byte[] buf = makeBuf(0xFF, 0x9F, 0xFF, 0xFF, 0xFF); // '|'
decode(newHuffmanDecoder(), buf);
decode(buf);
}
@Test(expected = Http2Exception.class)
public void testDecodePartialSymbol() throws Http2Exception {
byte[] buf = makeBuf(0x52, 0xBC, 0x30, 0xFF, 0xFF, 0xFF, 0xFF); // " pFA\x00", 31 bits of padding, a.k.a. EOS
decode(newHuffmanDecoder(), buf);
decode(buf);
}
private static byte[] makeBuf(int ... bytes) {
@@ -122,19 +122,19 @@ public class HpackHuffmanTest {
}
private static void roundTrip(String s) throws Http2Exception {
roundTrip(new HpackHuffmanEncoder(), newHuffmanDecoder(), s);
roundTrip(new HpackHuffmanEncoder(), s);
}
private static void roundTrip(HpackHuffmanEncoder encoder, HpackHuffmanDecoder decoder, String s)
private static void roundTrip(HpackHuffmanEncoder encoder, String s)
throws Http2Exception {
roundTrip(encoder, decoder, s.getBytes());
roundTrip(encoder, s.getBytes());
}
private static void roundTrip(byte[] buf) throws Http2Exception {
roundTrip(new HpackHuffmanEncoder(), newHuffmanDecoder(), buf);
roundTrip(new HpackHuffmanEncoder(), buf);
}
private static void roundTrip(HpackHuffmanEncoder encoder, HpackHuffmanDecoder decoder, byte[] buf)
private static void roundTrip(HpackHuffmanEncoder encoder, byte[] buf)
throws Http2Exception {
ByteBuf buffer = Unpooled.buffer();
try {
@@ -142,7 +142,7 @@ public class HpackHuffmanTest {
byte[] bytes = new byte[buffer.readableBytes()];
buffer.readBytes(bytes);
byte[] actualBytes = decode(decoder, bytes);
byte[] actualBytes = decode(bytes);
Assert.assertTrue(Arrays.equals(buf, actualBytes));
} finally {
@@ -150,18 +150,14 @@ public class HpackHuffmanTest {
}
}
private static byte[] decode(HpackHuffmanDecoder decoder, byte[] bytes) throws Http2Exception {
private static byte[] decode(byte[] bytes) throws Http2Exception {
ByteBuf buffer = Unpooled.wrappedBuffer(bytes);
try {
AsciiString decoded = decoder.decode(buffer, buffer.readableBytes());
AsciiString decoded = HpackHuffmanDecoder.decode(buffer, buffer.readableBytes());
Assert.assertFalse(buffer.isReadable());
return decoded.toByteArray();
} finally {
buffer.release();
}
}
private static HpackHuffmanDecoder newHuffmanDecoder() {
return new HpackHuffmanDecoder(32);
}
}

HpackTestCase.java

@@ -174,7 +174,7 @@ final class HpackTestCase {
maxHeaderTableSize = Integer.MAX_VALUE;
}
return new HpackDecoder(DEFAULT_HEADER_LIST_SIZE, 32, maxHeaderTableSize);
return new HpackDecoder(DEFAULT_HEADER_LIST_SIZE, maxHeaderTableSize);
}
private static byte[] encode(HpackEncoder hpackEncoder, List<HpackHeaderField> headers, int maxHeaderTableSize,

Http2TestUtil.java

@@ -136,7 +136,7 @@ public final class Http2TestUtil {
}
public static HpackDecoder newTestDecoder(long maxHeaderListSize, long maxHeaderTableSize) throws Http2Exception {
HpackDecoder hpackDecoder = new HpackDecoder(maxHeaderListSize, 32);
HpackDecoder hpackDecoder = new HpackDecoder(maxHeaderListSize);
hpackDecoder.setMaxHeaderTableSize(maxHeaderTableSize);
return hpackDecoder;
}

license/LICENSE.hyper-hpack.txt

@@ -0,0 +1,21 @@
The MIT License (MIT)
Copyright (c) 2014 Cory Benfield
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

license/LICENSE.nghttp2-hpack.txt

@@ -0,0 +1,23 @@
The MIT License
Copyright (c) 2012, 2014, 2015, 2016 Tatsuhiro Tsujikawa
Copyright (c) 2012, 2014, 2015, 2016 nghttp2 contributors
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

HpackDecoderBenchmark.java

@@ -72,7 +72,7 @@ public class HpackDecoderBenchmark extends AbstractMicrobenchmark {
@Benchmark
@BenchmarkMode(Mode.Throughput)
public void decode(final Blackhole bh) throws Http2Exception {
HpackDecoder hpackDecoder = new HpackDecoder(DEFAULT_HEADER_LIST_SIZE, 32);
HpackDecoder hpackDecoder = new HpackDecoder(DEFAULT_HEADER_LIST_SIZE);
@SuppressWarnings("unchecked")
Http2Headers headers =
new DefaultHttp2Headers() {