When overridden in a derived class, encodes a range of characters from a specified character array into a range of bytes in a byte array.
[Visual Basic]
```vb
MustOverride Public Function GetBytes( _
   ByVal chars() As Char, _
   ByVal charIndex As Integer, _
   ByVal charCount As Integer, _
   ByVal bytes() As Byte, _
   ByVal byteIndex As Integer, _
   ByVal flush As Boolean _
) As Integer
```

[C#]
```csharp
public abstract int GetBytes(
   char[] chars,
   int charIndex,
   int charCount,
   byte[] bytes,
   int byteIndex,
   bool flush
);
```

[C++]
```cpp
public: virtual int GetBytes(
   __wchar_t* chars[],
   int charIndex,
   int charCount,
   unsigned char* bytes[],
   int byteIndex,
   bool flush
) = 0;
```

[JScript]
```js
public abstract function GetBytes(
   chars : Char[],
   charIndex : int,
   charCount : int,
   bytes : Byte[],
   byteIndex : int,
   flush : Boolean
) : int;
```
The number of bytes stored into the byte array.
| Exception Type | Condition |
|---|---|
| ArgumentNullException | chars or bytes is a null reference (Nothing in Visual Basic). |
| ArgumentOutOfRangeException | charIndex, charCount, or byteIndex is less than zero. |
| ArgumentOutOfRangeException | charIndex + charCount is greater than the length of chars. |
| ArgumentOutOfRangeException | byteIndex + charCount is greater than the length of bytes. |
This method encodes charCount characters from chars, starting at index charIndex, and stores the resulting bytes in bytes, starting at index byteIndex. The encoding takes into account the state in which the encoder was left following the last call to this method. The flush parameter indicates whether the encoder should flush any shift states and partial characters at the end of the conversion.
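The following C# sketch (not part of the original reference) illustrates why the encoder's retained state and the flush parameter matter when input arrives in chunks. It uses Encoding.UTF8.GetEncoder(); the chunk contents, buffer size, and class name are arbitrary choices for the example.

```csharp
using System;
using System.Text;

class EncoderFlushExample
{
    static void Main()
    {
        // Two input chunks that split a surrogate pair (U+1D11E) across calls.
        char[] chunk1 = { 'a', '\uD834' };   // ends with a high surrogate
        char[] chunk2 = { '\uDD1E', 'b' };   // begins with the matching low surrogate

        Encoder encoder = Encoding.UTF8.GetEncoder();
        byte[] buffer = new byte[16];

        // flush = false: the trailing high surrogate is kept in the encoder's
        // internal state rather than being emitted as an incomplete character.
        int count1 = encoder.GetBytes(chunk1, 0, chunk1.Length, buffer, 0, false);

        // flush = true: this is the final chunk, so the retained surrogate is
        // combined with the low surrogate and any remaining state is emitted.
        int count2 = encoder.GetBytes(chunk2, 0, chunk2.Length, buffer, count1, true);

        Console.WriteLine("Bytes written: {0}", count1 + count2);
    }
}
```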
The GetByteCount method can be used to determine the exact number of bytes that will be produced for a specified range of characters. Alternatively, the GetMaxByteCount method of the Encoding that produced this encoder can be used to determine the maximum number of bytes that will be produced for a specified number of characters, regardless of the actual character values.
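As a hedged illustration of the two sizing strategies described above (again not taken from the original documentation), the C# fragment below contrasts the exact count from GetByteCount with the worst-case count from GetMaxByteCount when allocating the output array; the sample string and the use of UTF-8 are arbitrary.

```csharp
using System;
using System.Text;

class BufferSizingExample
{
    static void Main()
    {
        char[] chars = "résumé".ToCharArray();
        Encoder encoder = Encoding.UTF8.GetEncoder();

        // Exact number of bytes that GetBytes would produce for this range,
        // taking the encoder's current state into account.
        int exact = encoder.GetByteCount(chars, 0, chars.Length, true);

        // Worst-case number of bytes for chars.Length characters,
        // regardless of the actual character values.
        int worstCase = Encoding.UTF8.GetMaxByteCount(chars.Length);

        byte[] bytes = new byte[exact];
        int written = encoder.GetBytes(chars, 0, chars.Length, bytes, 0, true);

        Console.WriteLine("exact={0}, worstCase={1}, written={2}", exact, worstCase, written);
    }
}
```

Using GetByteCount keeps the allocation tight at the cost of an extra pass over the input; GetMaxByteCount avoids that pass but may allocate considerably more than is needed.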