- Added a new convenience method to ChannelInboundStreamHandlerAdapter
- EchoServerHandler uses the new method
- DefaultChannelPipeline calls inboundByteBuffer.discardReadBytes()
when it is sure there's no memory copy involved
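  A minimal sketch of one condition that guarantees this, assuming the
  ChannelBuffer API in this tree (readable() and discardReadBytes());
  the helper class and method names are made up: when the buffer is
  fully drained, discardReadBytes() only resets the indexes and moves
  no bytes.

    import io.netty.buffer.ChannelBuffer;

    final class DiscardSketch {
        // Illustration only: discard read bytes only when the buffer has
        // been read completely (readerIndex == writerIndex), so the call
        // merely resets the indexes and copies nothing.
        static void discardIfCheap(ChannelBuffer in) {
            if (!in.readable()) {
                in.discardReadBytes();
            }
        }
    }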
- Really attempt to create a queue to determine whether LTQ
  (LinkedTransferQueue) can be initialized at runtime, and cache the
  result (sketched below)
- Remove unnecessary Class<T> parameter in createQueue()
- Remove unused createQueue(Collection)
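  The detection pattern looks roughly like the sketch below; the class
  and field names are illustrative, not the actual QueueFactory source,
  and the JDK LinkedTransferQueue stands in for whichever LTQ
  implementation is on the classpath:

    import java.util.Queue;
    import java.util.concurrent.ConcurrentLinkedQueue;
    import java.util.concurrent.LinkedTransferQueue;

    final class QueueFactorySketch {
        // Probe once whether LinkedTransferQueue is usable in this
        // runtime and cache the answer instead of re-checking.
        private static final boolean USE_LTQ = detectLtq();

        private static boolean detectLtq() {
            try {
                new LinkedTransferQueue<Object>(); // really create one
                return true;
            } catch (Throwable t) {                // missing class, VM quirk, ...
                return false;
            }
        }

        // Note that no Class<T> parameter is needed any more.
        static <T> Queue<T> createQueue() {
            return USE_LTQ ? new LinkedTransferQueue<T>()
                           : new ConcurrentLinkedQueue<T>();
        }

        private QueueFactorySketch() { }
    }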
- LocalChannel and LocalServerChannel use it to close themselves on
  shutdown
- LocalEcho example does not call close() anymore because the channels
are closed automatically on shutdown
- Throwing an exception in this case is less confusing for the user
- To reduce the overhead of filling the stack trace,
NoSuchBufferException has a public pre-constructed instance.
- This is necessary because the codec framework sometimes needs to
  support both types of outbound buffers.
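  The shared-instance pattern looks roughly like this; the field name
  and the exact superclass are illustrative, not the actual source:

    // Sketch: the stack trace is filled in exactly once, when the shared
    // instance is created, instead of on every failed buffer lookup.
    public class NoSuchBufferException extends RuntimeException {

        private static final long serialVersionUID = 1L;

        public static final NoSuchBufferException INSTANCE =
                new NoSuchBufferException();

        public NoSuchBufferException() { }

        public NoSuchBufferException(String message) {
            super(message);
        }
    }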
- Fixed a bug where SpdyFrameEncoder did not handle ping messages
- Reduced memory copy in codec embedder (EmbeddedChannel)
- Renamed ChannelBootstrap to Bootstrap
- Renamed ServerChannelBootstrap to ServerBootstrap
- Moved bootstrap classes to io.netty.bootstrap as before
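  After the move the classes are imported from io.netty.bootstrap again;
  a minimal sketch in which only the names and package are the point
  (the sketch class is made up and the configuration calls are omitted
  because they may differ at this snapshot):

    import io.netty.bootstrap.Bootstrap;        // formerly ChannelBootstrap
    import io.netty.bootstrap.ServerBootstrap;  // formerly ServerChannelBootstrap

    final class BootstrapNamesSketch {
        static void sketch() {
            Bootstrap client = new Bootstrap();
            ServerBootstrap server = new ServerBootstrap();
            // ... configure and connect/bind as usual ...
        }
    }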
- Moved unfoldAndAdd() to a separate utility class
- Fixed a bug in unfoldAndAdd() where it did not handle ChannelBuffer
correctly
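  A simplified sketch of the unfolding idea; UnfoldSketch is a made-up
  class and the ChannelBuffer-specific handling that the fix touches is
  left out. A decoder result may be a single message, an Object[] or an
  Iterable, and it is flattened before being added to the next buffer:

    import java.util.Collection;

    final class UnfoldSketch {
        // Returns true if at least one message was added.
        static boolean unfoldAndAdd(Object msg, Collection<Object> out) {
            if (msg == null) {
                return false;
            }
            if (msg instanceof Object[]) {
                boolean added = false;
                for (Object m : (Object[]) msg) {
                    added |= unfoldAndAdd(m, out);
                }
                return added;
            }
            if (msg instanceof Iterable) {
                boolean added = false;
                for (Object m : (Iterable<?>) msg) {
                    added |= unfoldAndAdd(m, out);
                }
                return added;
            }
            return out.add(msg);
        }

        private UnfoldSketch() { }
    }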
- The previous API did not support a pipeline that contains multiple
  MessageToStreamEncoders because there was no way to find the closest
  outbound byte buffer. Now you always get the correct buffer even if
  the handler that provides the buffer is placed far away.
  For example:
    Channel -> MsgAEncoder -> MsgBEncoder -> MsgCEncoder
  Msg(A|B|C)Encoder all have access to the channel's outbound byte
  buffer. Previously, this was simply impossible.
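  A sketch of such a pipeline; the encoder classes below are empty
  placeholders rather than real MessageToStreamEncoders, and the
  ChannelHandlerAdapter/pipeline() names follow the final 4.x API, which
  may differ slightly from this snapshot:

    import io.netty.channel.Channel;
    import io.netty.channel.ChannelHandlerAdapter;
    import io.netty.channel.ChannelPipeline;

    final class EncoderChainSketch {
        // Stand-ins for the three encoders in the example above;
        // only the arrangement matters here.
        static final class MsgAEncoder extends ChannelHandlerAdapter { }
        static final class MsgBEncoder extends ChannelHandlerAdapter { }
        static final class MsgCEncoder extends ChannelHandlerAdapter { }

        static void configure(Channel ch) {
            ChannelPipeline p = ch.pipeline();
            p.addLast("a", new MsgAEncoder()); // closest to the channel
            p.addLast("b", new MsgBEncoder());
            p.addLast("c", new MsgCEncoder()); // farthest away, yet it can
                                               // still reach the channel's
                                               // outbound byte buffer
        }
    }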
- Improved ChannelBufferHolder.toString()
- AbstractChannel.doRead() is split into two versions so that the
implementation doesn't have to validate the buffer type.
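  The shape of the change, roughly (the method and type names below are
  illustrative, not necessarily the exact ones): instead of one doRead()
  that receives an untyped holder and has to check what kind of buffer
  it was given, a transport overrides only the variant that matches the
  buffer type it declared.

    import java.util.Queue;

    abstract class AbstractChannelSketch {
        // Byte-oriented transports override this one ...
        protected int doReadBytes(ByteSink in) throws Exception {
            throw new UnsupportedOperationException();
        }

        // ... and message-oriented transports override this one, so
        // neither has to validate the buffer type at runtime.
        protected int doReadMessages(Queue<Object> in) throws Exception {
            throw new UnsupportedOperationException();
        }

        // Stand-in for the byte buffer type to keep the sketch
        // self-contained.
        interface ByteSink {
            void writeBytes(byte[] src);
        }
    }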
- Optimized ChannelBufferHolder a little bit
- Reduced GC pressure related to flush future notification
- Added FlushCheckpoint; DefaultChannelFuture implements it
  opportunistically
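  The idea, roughly (the method names and stub types below are
  illustrative): a flush future has to be remembered together with the
  write counter at which it becomes complete; by letting
  DefaultChannelFuture carry that counter itself, the common case queues
  the future directly and no extra per-flush wrapper object is
  allocated.

    final class FlushCheckpointSketch {

        // What the outbound machinery needs per pending flush.
        interface FlushCheckpoint {
            long flushCheckpoint();
            void flushCheckpoint(long checkpoint);
        }

        // Stand-in for DefaultChannelFuture: it *is* a FlushCheckpoint,
        // so it can be stored in the checkpoint queue as-is.
        static final class DefaultChannelFutureStub implements FlushCheckpoint {
            private long checkpoint;
            @Override public long flushCheckpoint() { return checkpoint; }
            @Override public void flushCheckpoint(long checkpoint) {
                this.checkpoint = checkpoint;
            }
        }

        // A wrapper is allocated only for a future that does not
        // implement FlushCheckpoint; that is the uncommon case.
        static FlushCheckpoint checkpoint(Object future, long writeCounter) {
            if (future instanceof FlushCheckpoint) {
                FlushCheckpoint cp = (FlushCheckpoint) future;
                cp.flushCheckpoint(writeCounter);
                return cp;
            }
            final long initial = writeCounter;
            return new FlushCheckpoint() {
                private long checkpoint = initial;
                @Override public long flushCheckpoint() { return checkpoint; }
                @Override public void flushCheckpoint(long checkpoint) {
                    this.checkpoint = checkpoint;
                }
            };
        }

        private FlushCheckpointSketch() { }
    }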