The charCodeAt() method returns an integer between 0 and 65535 representing the UTF-16 code unit at the given index.
charCodeAt() Syntax
The syntax of the charCodeAt() method is:
str.charCodeAt(index)
Here, str is a string.
charCodeAt() Parameters
The charCodeAt() method takes a single parameter:
- index (optional) - An integer between 0 and (str.length - 1). Defaults to 0.
Note: str.length returns the length of the given string.
charCodeAt() Return Value
- Returns a number representing the UTF-16 code unit value of the character at the given index.
Notes:
- The charCodeAt() method always returns a value less than 65,536.
- If a Unicode code point cannot be represented in a single UTF-16 code unit (code points greater than 0xFFFF), charCodeAt() returns only the first code unit of the surrogate pair for that code point.
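The surrogate-pair behavior can be seen with any character outside the Basic Multilingual Plane; a quick sketch using the emoji U+1F600 as an illustration:

```javascript
// U+1F600 is stored as two UTF-16 code units (a surrogate pair),
// so charCodeAt() can only return one half at a time.
const emoji = "😀";

console.log(emoji.length);         // 2 (two code units)
console.log(emoji.charCodeAt(0));  // 55357 (0xD83D, high surrogate)
console.log(emoji.charCodeAt(1));  // 56832 (0xDE00, low surrogate)
console.log(emoji.codePointAt(0)); // 128512 (0x1F600, the full code point)
```

To get the full code point rather than a single code unit, use codePointAt() instead.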
Example 1: Using charCodeAt() Method
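The code listing for this example is missing; a sketch consistent with the output and explanation below, assuming the string "Good morning!":

```javascript
// index 5 of "Good morning!" is "m"
const sentence = "Good morning!";

console.log(sentence.charCodeAt(5));   // 109
// non-integer indexes are truncated to the integer 5
console.log(sentence.charCodeAt(5.2)); // 109
console.log(sentence.charCodeAt(5.9)); // 109
```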
Output
109 109 109
In the above example, we use the charCodeAt() method to access the UTF-16 code unit of the character at index 5.
Since the character at index 5 is "m", the method returns the UTF-16 code unit of "m".
Similarly, for the non-integer indexes 5.2 and 5.9, the fractional part is truncated, giving index 5, so the method again returns the UTF-16 code unit of "m", i.e. 109.
Example 2: charCodeAt() Method for Index Out of Range
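The code listing for this example is missing; a sketch consistent with the output and explanation below, assuming the string "Good morning!":

```javascript
const greeting = "Good morning!";

// valid indexes run from 0 to greeting.length - 1 (0 to 12)
console.log(greeting.charCodeAt(18)); // NaN
console.log(greeting.charCodeAt(-9)); // NaN
```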
Output
NaN NaN
In the above example, we have created a string "Good morning!".
Here, both greeting.charCodeAt(18) and greeting.charCodeAt(-9) return NaN because the indexes 18 and -9 fall outside the valid range 0 to 12 (str.length - 1).
Example 3: charCodeAt() with Default Parameter
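The code listing for this example is missing; a sketch consistent with the output below, assuming a string starting with "G" (whose code unit is 71), such as "Good morning!":

```javascript
const greeting = "Good morning!";

// without an argument, the index defaults to 0
console.log(greeting.charCodeAt());  // 71
console.log(greeting.charCodeAt(0)); // 71
```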
Output
71 71
In the above example, since we have not passed any argument to charCodeAt(), the index defaults to 0.
So the method returns the UTF-16 code unit of the character at index 0, i.e. 71 (the code unit for "G").
Recommended Reading: JavaScript String fromCharCode()