Unlock the Secrets of JavaScript’s charCodeAt() Method
What is charCodeAt() and How Does it Work?
The charCodeAt() method takes a single parameter, an integer index, and returns a number representing the UTF-16 code unit value of the character at that index. The syntax is straightforward: str.charCodeAt(index). Here, str is the string you’re working with, and index is an integer between 0 and (str.length - 1).
Decoding the Return Value
The charCodeAt() method always returns a value less than 65,536 (0x10000). Code points greater than 0xFFFF cannot fit in a single UTF-16 code unit, so they are stored as a surrogate pair of two code units. In that case, charCodeAt() at the character’s starting index returns only the first (high) surrogate of the pair, not the full code point.
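A quick sketch of this behavior with an emoji, which lies outside the Basic Multilingual Plane (the specific code unit values follow from how UTF-16 surrogate pairs are encoded):

```javascript
// "😀" (U+1F600) needs two UTF-16 code units, so it is stored
// as a surrogate pair: 0xD83D (high) followed by 0xDE00 (low).
const emoji = "😀";
console.log(emoji.length);        // 2 — two code units, one visible character
console.log(emoji.charCodeAt(0)); // 55357 (0xD83D) — only the high surrogate
console.log(emoji.charCodeAt(1)); // 56832 (0xDE00) — the low surrogate
console.log(emoji.codePointAt(0)); // 128512 (0x1F600) — the full code point
```

When you need the full code point rather than an individual code unit, codePointAt() is the method to reach for.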
Exploring charCodeAt() with Examples
Let’s dive into some practical examples to illustrate how charCodeAt() works.
Example 1: Accessing the UTF-16 Code Unit
var str = "Hello, world!";
console.log(str.charCodeAt(5)); // Output: 44
The character at index 5 is the comma “,”, so the method returns its UTF-16 code unit, 44.
Interestingly, if we pass a non-integer index, such as 5.2 or 5.9, it is truncated to an integer (both become 5), and the method returns the UTF-16 code unit of the character at that index.
Example 2: Out of Range Index
var greeting = "Good morning!";
console.log(greeting.charCodeAt(18)); // Output: NaN
console.log(greeting.charCodeAt(-9)); // Output: NaN
As expected, both greeting.charCodeAt(18) and greeting.charCodeAt(-9) return NaN, because the indexes 18 and -9 fall outside the valid range of 0 to 12 for this 13-character string.
Example 3: Default Parameter Value
var str = "Good morning!";
console.log(str.charCodeAt()); // Output: 71
In this case, the default value of 0 is used, and the method returns the UTF-16 code unit of the character at index 0, which is 71.
Conclusion
By mastering the charCodeAt() method, you’ll unlock new possibilities for string manipulation and processing in JavaScript. Whether you’re working with Unicode characters or extracting specific code units, this powerful tool is an essential addition to your developer toolkit.