chore: Test exports

Dmitry Shirokov 2022-09-30 17:04:00 +10:00
parent efad3c3f0d
commit a1935b1ba7
No known key found for this signature in database
GPG Key ID: 8497F81BB7F0B0C0
5 changed files with 7 additions and 9 deletions


@@ -23,4 +23,4 @@ jobs:
 - run: npm i
 - run: npm test
 - run: npm run build
-- run: node .github/workflows/test-build.sh
+- run: .github/workflows/test-build.sh


@@ -1,5 +1,4 @@
 const assert = require('assert');
-const path = require('path');
 const chardet = require(process.cwd());

.github/workflows/test-build.sh (vendored): 0 line changes, file mode Normal file → Executable file


@@ -1,5 +1,4 @@
 import assert from 'assert';
-import path from 'path';
 const main = async () => {
   const chardet = await import(process.cwd());


@@ -21,19 +21,19 @@ npm i chardet
 To return the encoding with the highest confidence:
 ```javascript
-const chardet = require('chardet');
+import chardet from 'chardet';
-chardet.detect(Buffer.from('hello there!'));
+const encoding = chardet.detect(Buffer.from('hello there!'));
 // or
-chardet.detectFile('/path/to/file').then(encoding => console.log(encoding));
+const encoding = await chardet.detectFile('/path/to/file');
 // or
-chardet.detectFileSync('/path/to/file');
+const encoding = chardet.detectFileSync('/path/to/file');
 ```
 To return the full list of possible encodings use `analyse` method.
 ```javascript
-const chardet = require('chardet');
+import chardet from 'chardet';
 chardet.analyse(Buffer.from('hello there!'));
 ```
@@ -48,7 +48,7 @@ Returned value is an array of objects sorted by confidence value in decending or
 ## Working with large data sets
-Sometimes, when data set is huge and you want to optimize performace (in tradeoff of less accuracy),
+Sometimes, when data set is huge and you want to optimize performace (with a tradeoff of less accuracy),
 you can sample only first N bytes of the buffer:
 ```javascript