Compare commits

...

264 Commits

Author SHA1 Message Date
semantic-release-bot 33f0b4c6d0 chore(release): 0.1.0-develop.1 [skip ci]
# [0.1.0-develop.1](https://git.lumeweb.com/LumeWeb/relay/compare/v0.0.1...v0.1.0-develop.1) (2023-04-23)

### Bug Fixes

* change ed25519-keygen import ([7ec8bb7](7ec8bb79db))
* update import ([c7f0dd5](c7f0dd586d))

### Features

* We are triggering a dummy feature commit for all past breaking changes and prototyping ([23f9e0f](23f9e0f6bd))

### Reverts

* Revert "*Need to use dynamic import" ([d288685](d28868508b))
* Revert "*Need to return dynamic function and execute it" ([485fa98](485fa98f0e))
* Revert "*Temp disable steps, debug" ([d83af7b](d83af7bf1a))
2023-04-23 13:16:29 +00:00
Derrick Hammer 062f5c9899
ci: need to install python pip and access as pip2 2023-04-23 09:13:27 -04:00
Derrick Hammer 47f768b20b
ci: need sudo on go install, chmod it, then access it directly 2023-04-23 09:10:31 -04:00
Derrick Hammer c08e130e97
ci: ci already has yq 2023-04-23 09:02:44 -04:00
Derrick Hammer ec43f49562
ci: update publish script to install golang if needed 2023-04-23 08:46:12 -04:00
Derrick Hammer 93c110c755
ci: move setup job to be steps under the setup in node/run 2023-04-23 08:35:53 -04:00
Derrick Hammer a0d33d3880
ci: add context 2023-04-23 08:30:07 -04:00
Derrick Hammer 382761e484
ci: add credijusto/ssh orb to setup git server in known hosts 2023-04-23 08:28:18 -04:00
Derrick Hammer 188f7986b6
ci: remove github actions 2023-04-23 08:15:40 -04:00
Derrick Hammer e317a3209d
build: switch to fork for cjs 2023-04-23 08:09:37 -04:00
Derrick Hammer f1ca174022
ci: need to use patch-package to remove the files element from sodium-native so pkg does not choke 2023-04-23 07:57:34 -04:00
Derrick Hammer f7ba77e04f
ci: set job names and have release depend on build 2023-04-23 07:26:43 -04:00
Derrick Hammer e933cbe918
ci: update fingerprint 2023-04-23 07:23:32 -04:00
Derrick Hammer 20543a0612
ci: switch to cimg/node:lts 2023-04-23 07:22:19 -04:00
Derrick Hammer 52c209dbb9
ci: rewrite ci 2023-04-23 07:10:48 -04:00
Derrick Hammer 4f920d6958
ci: setup circle ci 2023-04-23 06:30:32 -04:00
Derrick Hammer 69e999aef5
ci: debug 2023-04-21 22:34:44 -04:00
Derrick Hammer 533e741c15
Revert "ci: try switching to building node"
This reverts commit 9f07b21566.
2023-04-21 21:03:39 -04:00
Derrick Hammer 9f07b21566
ci: try switching to building node 2023-04-21 20:16:55 -04:00
Derrick Hammer d5f78e4d27
build: fix compile script 2023-04-21 19:35:44 -04:00
Derrick Hammer 6c9fe8208a
ci: add exec plugin and call new publish script to deploy deb to package repo 2023-04-21 19:31:57 -04:00
Derrick Hammer 23f9e0f6bd
feat: We are triggering a dummy feature commit for all past breaking changes and prototyping 2023-04-21 18:42:49 -04:00
Derrick Hammer b39d0dd7e7
ci: change repo url to be the ssh version so that semantic-version properly handles it 2023-04-21 04:01:50 -04:00
Derrick Hammer 00837eb143
ci: add build step 2023-04-21 04:01:08 -04:00
Derrick Hammer 2dafa592ed
ci: remove unneeded env vars 2023-04-21 04:00:58 -04:00
Derrick Hammer 7b78e882d8
ci: add @semantic-release/git 2023-04-20 05:32:15 -04:00
Derrick Hammer 1350929bca
ci: update git remote command 2023-04-20 03:53:15 -04:00
Derrick Hammer 3d64870f17
cleanup: remove unused imports 2023-04-19 23:31:36 -04:00
Derrick Hammer a2055079d1
refactor: use p-defer 2023-04-19 23:29:49 -04:00
Derrick Hammer 6964c6bed2
Merge remote-tracking branch 'origin/wip' into develop 2023-04-19 23:26:50 -04:00
Derrick Hammer c1e474e025
build: setup semantic release and switch to github actions 2023-04-19 23:21:15 -04:00
Derrick Hammer 0ebd876c67
chore: remove unneeded deps 2023-04-19 23:08:26 -04:00
Derrick Hammer c7e0e23950
style: use casing in option names for readability 2023-04-19 18:42:23 -04:00
Derrick Hammer b6a6bdc97f
build: we don't need any assets added anymore for now 2023-04-19 18:40:49 -04:00
Derrick Hammer d06575e9ee
refactor: use fork of ed25519-keygen 2023-04-19 18:40:27 -04:00
Derrick Hammer ddc93ba71e
build: switch back to npm 2023-04-19 18:39:50 -04:00
Derrick Hammer 7ec8bb79db
fix: change ed25519-keygen import 2023-04-19 18:39:11 -04:00
Derrick Hammer 1bd466cc20
build: add micro-packed fork that uses cjs 2023-04-19 06:01:25 -04:00
Derrick Hammer 2e3fe286f9
chore: remove packageManager 2023-04-19 05:59:22 -04:00
Derrick Hammer 3facaddae8
chore: remove resolutions 2023-04-19 05:58:36 -04:00
Derrick Hammer abafc1c715
build: update build scripts 2023-04-19 05:33:42 -04:00
Derrick Hammer 6621f25b5e
build: add more packages to hoist 2023-04-19 05:32:57 -04:00
Derrick Hammer 6d2cffd869
build: add .npmrc 2023-04-19 05:27:29 -04:00
Derrick Hammer b3fb59d283
chore: update deps 2023-04-19 05:23:16 -04:00
Derrick Hammer c7f0dd586d
fix: update import 2023-04-19 05:16:20 -04:00
Derrick Hammer c6d4ea5a8e
refactor: switch to @lumeweb/interface-relay 2023-04-19 05:15:58 -04:00
Derrick Hammer c25bf102fe
refactor: switch to ed25519-keygen 2023-04-19 05:10:36 -04:00
Derrick Hammer 79f981d789
chore: clean up package.json 2023-04-19 05:01:00 -04:00
Derrick Hammer 0736469296
refactor: removing rpc methods that require caching 2023-04-19 04:47:48 -04:00
Derrick Hammer 69e613075e
*Removing ssl support, will delegate it to caddy 2023-04-19 02:29:11 -04:00
Derrick Hammer 671c7ad6a1
*Update cfg 2023-04-19 02:27:21 -04:00
Derrick Hammer 5439c9dc92
*Add getter for domain 2023-04-19 02:27:12 -04:00
Derrick Hammer c26be980c5
*Refactor loading of ssl 2023-04-19 02:26:51 -04:00
Derrick Hammer d5a4956be7
*Use casing for readability 2023-04-19 00:55:15 -04:00
Derrick Hammer a9366c915d
*fix typo 2023-04-19 00:54:45 -04:00
Derrick Hammer 90e0f3f2c4
*Add core.appServer.buildRoutes async event to allow plugins to register routes 2023-04-19 00:29:13 -04:00
Derrick Hammer 38d6198628
*Change core.appServerStarted to core.appServer.started 2023-04-19 00:28:33 -04:00
Derrick Hammer 053eca0cf4
*await on plugin 2023-04-19 00:08:19 -04:00
Derrick Hammer a0504443e6
*refactor to handle invalid plugins 2023-04-19 00:01:14 -04:00
Derrick Hammer 21887df639
*Ensure we are using the core namespace 2023-04-18 23:53:00 -04:00
Derrick Hammer a628a9a07f
*set config dir property name 2023-04-18 23:49:34 -04:00
Derrick Hammer 1b7ad1a896
*Emit core.appServerStarted event after the appserver is booted 2023-04-18 22:47:40 -04:00
Derrick Hammer 3ac8e38f66
*Add app getter method to api 2023-04-18 22:47:09 -04:00
Derrick Hammer 356681af35
*Update cert types 2023-04-18 22:35:28 -04:00
Derrick Hammer 714da70209
*refactor ssl support 2023-04-18 20:43:37 -04:00
Derrick Hammer 3b1b6425ae
*make the non-SSL port configurable 2023-04-18 20:28:51 -04:00
Derrick Hammer 7f1dde272a
*remove rpc cache system, as it is not needed for now. This can be reverted in the future. 2023-04-18 20:19:27 -04:00
Derrick Hammer f720f40f05
*namespace all core options 2023-04-18 20:12:01 -04:00
Derrick Hammer 2b12150d71
*Backwards compat fix to ensure Protomux is stored on the stream 2023-04-07 20:52:49 -04:00
Derrick Hammer 556373c5bc
*Ensure we use Protomux pair for the RPC service 2023-04-07 20:52:23 -04:00
Derrick Hammer a003da1606
*Remove debug logging 2023-03-29 17:16:13 -04:00
Derrick Hammer 58e95806d0
*hook on core.pluginsLoaded to ensure that we don't answer until all plugins are loaded 2023-03-29 16:23:56 -04:00
Derrick Hammer 3600fbfdcf
*await on plugin loading 2023-03-29 16:23:17 -04:00
Derrick Hammer 2d30390fa2
*fix listener callback 2023-03-18 15:07:52 -04:00
Derrick Hammer d3d0f387b6
*Access the log object directly 2023-01-13 17:53:54 -05:00
Derrick Hammer eb2a19121f
*Remove the logger object from the PluginAPI object 2023-01-13 17:53:37 -05:00
Derrick Hammer d5138c3860
*When fetching the logger in the plugin api, return a child instance of the logger instead for that plugin 2023-01-13 17:46:53 -05:00
Derrick Hammer 594e8d82a1
*Protomux.pair takes an object bag on the first argument 2023-01-13 17:42:52 -05:00
Derrick Hammer 90d8bfba63
*Add debug logging for plugin loading 2023-01-08 14:13:11 -05:00
Derrick Hammer da14bac9b2
*Update types 2023-01-07 23:42:14 -05:00
Derrick Hammer 8d52e40f20
*Add new utilities on plugin api
*add access to compact-encoding via binaryEncoding
*add access to b4a via bufferEncoding
*add crypto utility with createHash helper
2023-01-07 23:34:06 -05:00
Derrick Hammer 72c663795a
*Update relay-types 2023-01-07 23:32:33 -05:00
Derrick Hammer 74c20c6042
*Don't use getRpcServer directly 2023-01-07 23:04:36 -05:00
Derrick Hammer ce23f0a7b8
*Refactor how core plugins are loaded 2023-01-07 23:02:13 -05:00
Derrick Hammer 5c85ad2962
*Move core plugins outside modules folder 2023-01-07 22:58:06 -05:00
Derrick Hammer f8a698c656
*Remove unneeded import 2023-01-07 22:56:16 -05:00
Derrick Hammer 5dee9aeed4
*Unneeded parameter
2023-01-05 20:58:39 -05:00
Derrick Hammer 2ae556ad14
*Add get_topics rpc method
2023-01-05 20:55:46 -05:00
Derrick Hammer 523f2c04f0
*Add new core dht plugin that allows joining a DHT topic and getting the peers of a given topic 2023-01-05 20:51:43 -05:00
Derrick Hammer ba86b3587b
*add core.shutdown hook for bootstrap DHT
2023-01-05 19:44:43 -05:00
Derrick Hammer 831612bc77
*Add an async shutdown event so that plugins can take action before the swarm is destroyed 2023-01-05 19:39:03 -05:00
Derrick Hammer 6a41daff62
*Ensure the swarm properly shuts down and un-announces itself 2023-01-05 19:36:06 -05:00
Derrick Hammer d569c5fd8b
*Add readme with required grant compliance
2023-01-01 17:57:23 -05:00
Derrick Hammer 3a0ed24b9d
*update dht-flood
2022-12-31 14:22:05 -05:00
Derrick Hammer f71d8f8685
*Private key must be full 64 bit private key 2022-12-31 13:30:16 -05:00
Derrick Hammer 2c22c88e30
*address does not fit the addressinfo type 2022-12-31 13:28:59 -05:00
Derrick Hammer 14edcdc6ce
*Bootstrap argument needs to be passed to DHT, not swarm 2022-12-30 18:27:18 -05:00
Derrick Hammer be22b771f5
*Update dht flood
2022-12-30 16:44:02 -05:00
Derrick Hammer 1b78d0e696
*Implement a protocol manager to register protomux based protocols and add it to the plugin api 2022-12-30 01:00:45 -05:00
Derrick Hammer 90ef9e3386
*Update dht-flood 2022-12-30 00:13:39 -05:00
Derrick Hammer ca65db07a3
*Use patched micro-ed25519-hdkey 2022-12-30 00:13:11 -05:00
Derrick Hammer 1da240ecd6
dummy
2022-12-22 14:18:38 -05:00
Derrick Hammer bcc63ae15d
*Need to pass through proxy requests to config object 2022-12-22 09:54:05 -05:00
Derrick Hammer 89b0942bc8
*Update cfg 2022-12-22 09:47:48 -05:00
Derrick Hammer 7c9a7230fe
*package updates 2022-12-22 09:47:17 -05:00
Derrick Hammer 6a310a1e1f
*Use the latest hyperswarm, 4.3.5
*Update dht to 6.4.0
2022-12-22 08:30:35 -05:00
Derrick Hammer 47705c2f12
*Add a publicly accessible bootstrap node
*Setup our bootstrap to start with our node first, and add in the default bootstrap servers
2022-12-22 08:28:58 -05:00
Derrick Hammer 24ec1e6526
*Remove default plugins list 2022-12-21 16:25:31 -05:00
Derrick Hammer af7e441c7c
*save no longer needs file extension 2022-12-21 16:12:33 -05:00
Derrick Hammer 13522532a3
*Update cfg package 2022-12-21 15:17:03 -05:00
Derrick Hammer 3d29026a22
*Add new pluginConfig api that scopes all get, set, and has methods to a plugin.${plugin} scope to access only that plugins config 2022-12-21 15:12:20 -05:00
Derrick Hammer 5a4537ffd3
*Update cfg package 2022-12-21 15:11:36 -05:00
Derrick Hammer aa2dfe0ba0
*Update api reference
2022-12-19 15:38:43 -05:00
Derrick Hammer edda68506c
dummy
2022-12-19 15:35:45 -05:00
Derrick Hammer 630dce7b05
*Update relay-types 2022-12-19 15:34:44 -05:00
Derrick Hammer 7b26856720
*Update rpc methods that require cache to only be enabled if on
*Make get_direct_peers, get_connected_peers, and get_bootstrap_info debug methods
2022-12-19 15:34:32 -05:00
Derrick Hammer af4e6155bc
*Make dht cache optional, but enabled by default 2022-12-19 15:28:22 -05:00
Derrick Hammer 1e3b0e46fe
dummy
2022-12-19 14:38:31 -05:00
Derrick Hammer caa72f93fc
*Update cfg package 2022-12-19 14:38:03 -05:00
Derrick Hammer b9b9ae0afa
dummy
2022-12-19 14:30:35 -05:00
Derrick Hammer 6fe25b7f17
*Update cfg package 2022-12-19 14:29:42 -05:00
Derrick Hammer 212edf184d
*More log refactoring
2022-12-19 14:17:07 -05:00
Derrick Hammer 7b9c667749
*Routes must be defined before listen 2022-12-19 14:03:22 -05:00
Derrick Hammer d2acea6781
*Update cfg package 2022-12-19 14:01:35 -05:00
Derrick Hammer 9b9ff2118c
*export ssl manager
2022-12-19 12:17:33 -05:00
Derrick Hammer b57644836f
*Path bug fix 2022-12-19 12:14:24 -05:00
Derrick Hammer c207452c65
*Update API interface to add the identity and ssl manager
2022-12-19 12:08:33 -05:00
Derrick Hammer c139eb3165
dummy
2022-12-19 11:58:44 -05:00
Derrick Hammer 2bc9c03ea1
*Fix cfg dep 2022-12-19 11:56:17 -05:00
Derrick Hammer b7cd9ac5e2
*More log fixes
2022-12-19 11:47:09 -05:00
Derrick Hammer a6eef21da0
*Make log the default export
2022-12-19 11:44:11 -05:00
Derrick Hammer 88a827276e
*Update references of log object
2022-12-19 11:42:51 -05:00
Derrick Hammer 05223c84b8
*Update config save call 2022-12-19 11:42:32 -05:00
Derrick Hammer ccd2b4fd78
*Update use of config api
2022-12-19 11:37:49 -05:00
Derrick Hammer bf8a3f9aa1
*Update cfg library 2022-12-19 11:36:34 -05:00
Derrick Hammer f86c924299
*Update deps 2022-12-19 08:51:52 -05:00
Derrick Hammer f164f7a6d3
*Pass child pino logger to dht-cache 2022-12-19 08:51:35 -05:00
Derrick Hammer 3fffc08d54
*switch to pino logger 2022-12-19 08:19:43 -05:00
Derrick Hammer f597afac6a
*Implement new SSL support with SSLManager class
*Rewrite relay servers to use fastify
*Remove status code server, it will be a plugin
2022-12-19 08:09:25 -05:00
Derrick Hammer ef03883605
*switch app to using fastify 2022-12-19 07:13:11 -05:00
Derrick Hammer 84b69e09af
*Update return type on getKeyPair 2022-12-19 07:12:10 -05:00
Derrick Hammer f7a696a65f
*Remove unneeded packages 2022-12-19 06:46:14 -05:00
Derrick Hammer ce5d66095f
*Remove ssl support for now
2022-12-18 15:11:18 -05:00
Derrick Hammer b7fc6834eb
dummy
2022-12-18 15:09:05 -05:00
Derrick Hammer 56673a6bc1
*remove log debugging code 2022-12-18 15:08:04 -05:00
Derrick Hammer 457398b291
*Prune out old or unneeded code, some may be refactored back in as plugins or in a different form
2022-12-18 15:05:40 -05:00
Derrick Hammer 6fa5ccd49a
*Add swarm to plugin api 2022-12-18 15:01:27 -05:00
Derrick Hammer 480bfdd0d0
*Remove dns module for now, will refactor this back later
2022-12-18 14:44:39 -05:00
Derrick Hammer b50b8e8ced
*Make bip44 path a constant 2022-12-18 14:00:22 -05:00
Derrick Hammer 37b0e824c4
*Update bip44 path 2022-12-18 13:58:23 -05:00
Derrick Hammer a85b20769a
*Fix valid check 2022-12-18 13:46:00 -05:00
Derrick Hammer 5fba29b3ee
*Update imports for getKeyPair and getSeed 2022-12-18 13:45:12 -05:00
Derrick Hammer 7fad293bdc
*validateMnemonic returns a boolean 2022-12-18 13:42:41 -05:00
Derrick Hammer db4b61bdd9
*Switch seed to bip39, and use bip32/bip44/slip10 2022-12-18 13:32:59 -05:00
Derrick Hammer d62ce9a4d3
dummy
2022-12-18 11:43:12 -05:00
Derrick Hammer 2db95ff746
*Add plugin emit for plugins loaded 2022-12-18 11:42:50 -05:00
Derrick Hammer f849579156
dummy
2022-12-18 11:09:51 -05:00
Derrick Hammer 2b5d3ef646
*Heavily refactor plugin api, remove most methods for now
*Introduce event emitter2 which will be used as part of api
*Use a proxy class to create a custom PluginAPI.registerMethod which will pass the lexical scoped plugin name
2022-12-18 11:09:29 -05:00
Derrick Hammer 523fe07028
dummy
2022-12-18 09:30:55 -05:00
Derrick Hammer fe14767e5a
*Add purging cloudflare 2022-12-18 09:30:40 -05:00
Derrick Hammer d2ca086c11
dummy
2022-12-18 07:18:06 -05:00
Derrick Hammer 0e39f3d658
*Missing returning rpc instance 2022-12-18 07:17:54 -05:00
Derrick Hammer f74c66ab5f
dummy
2022-12-18 07:00:07 -05:00
Derrick Hammer 9d6a198bca
*Extract RPC setup to a utility function and ensure new streams in getRpcByPeer use it 2022-12-18 06:58:45 -05:00
Derrick Hammer d14320b9d0
dummy
2022-12-17 16:01:06 -05:00
Derrick Hammer 7fa2ec9f8e
*Release lock before throwing error 2022-12-17 15:58:46 -05:00
Derrick Hammer 621230ef2c
*Update dht-cache
2022-12-17 13:08:24 -05:00
Derrick Hammer b0fcc1d835
*Update dht-cache 2022-12-17 13:08:03 -05:00
Derrick Hammer b2665df70b
*Update dht-cache
2022-12-17 12:33:05 -05:00
Derrick Hammer 3b1e31ea2d
*Update dht-cache 2022-12-17 12:29:14 -05:00
Derrick Hammer 014f92342a
*Update dht-cache
2022-12-17 11:38:55 -05:00
Derrick Hammer a062a522d3
dummy
2022-12-17 11:05:26 -05:00
Derrick Hammer 23e1eb79ff
*Update dht-cache 2022-12-17 11:04:54 -05:00
Derrick Hammer 6445ad0c82
dummy
2022-12-17 10:18:24 -05:00
Derrick Hammer 6273c634e9
*Update dht-cache 2022-12-17 10:17:36 -05:00
Derrick Hammer 01de0d586b
dummy
2022-12-17 09:50:07 -05:00
Derrick Hammer 06b3f03fec
*Update dht-cache 2022-12-17 09:49:43 -05:00
Derrick Hammer 1a99c3d8cd
dummy
2022-12-17 09:17:08 -05:00
Derrick Hammer 8cf2770f42
*Update dht-cache 2022-12-17 09:08:02 -05:00
Derrick Hammer 1c57d0af4a
*Update dht-cache
2022-12-16 13:56:32 -05:00
Derrick Hammer 38c85468a4
dummy
2022-12-16 11:41:58 -05:00
Derrick Hammer 8e8039eee6
dummy
2022-12-16 11:33:05 -05:00
Derrick Hammer 4867c9a986
*Update dht-cache 2022-12-16 11:32:46 -05:00
Derrick Hammer 2700630a9c
dummy
2022-12-16 09:48:12 -05:00
Derrick Hammer db9ada506b
dummy
2022-12-16 09:36:36 -05:00
Derrick Hammer d7c506757c
*Update dht-cache 2022-12-16 09:36:15 -05:00
Derrick Hammer b6f1e4ba66
*Ensure we are using node-fetch 2 2022-12-16 09:36:06 -05:00
Derrick Hammer faec15f06d
dummy
2022-12-16 08:22:56 -05:00
Derrick Hammer de38a16ac1
*Bug fix 2022-12-16 08:22:38 -05:00
Derrick Hammer 4057f4d388
dummy
2022-12-16 08:09:24 -05:00
Derrick Hammer 7c873db91b
*Add debug code to track log messages and store them in memory to be accessed over rpc 2022-12-16 08:05:16 -05:00
Derrick Hammer 5e4f45180e
dummy
2022-12-16 06:55:04 -05:00
Derrick Hammer 487ba0b5bd
*Ensure we only use node-fetch 2 2022-12-16 06:52:43 -05:00
Derrick Hammer 58ff8f2f92
*Add methods get_bootstrap_info and get_connected_peers to debug
2022-12-15 13:44:50 -05:00
Derrick Hammer f33e77da54
dummy
2022-12-15 12:16:27 -05:00
Derrick Hammer 3736bf0e51
*Partially upgrade to yarn pnp 2022-12-15 12:04:13 -05:00
Derrick Hammer 58190128a0
*Use our fork of p-timeout for commonjs
2022-12-15 07:03:14 -05:00
Derrick Hammer d28868508b
Revert "*Need to use dynamic import"
This reverts commit 6413d97c61.
2022-12-15 06:23:28 -05:00
Derrick Hammer 485fa98f0e
Revert "*Need to return dynamic function and execute it"
This reverts commit 55fa792bc9.
2022-12-15 06:23:28 -05:00
Derrick Hammer bd1226ad18
dummy
2022-12-15 06:08:19 -05:00
Derrick Hammer 55fa792bc9
*Need to return dynamic function and execute it 2022-12-15 06:04:24 -05:00
Derrick Hammer 6413d97c61
*Need to use dynamic import
2022-12-15 05:42:22 -05:00
Derrick Hammer 324106d54f
*Use corepack and yarn 2
2022-12-15 05:22:58 -05:00
Derrick Hammer 884ba62bde
*Add timeout support to broadcast request with a default timeout of 5 seconds.
2022-12-15 04:58:54 -05:00
Derrick Hammer cd2f3a415c
dummy
2022-12-13 09:49:17 -05:00
Derrick Hammer d83af7bf1a
Revert "*Temp disable steps, debug"
This reverts commit 15c127f26b.
2022-12-13 08:46:59 -05:00
Derrick Hammer 15c127f26b
*Temp disable steps, debug
2022-12-13 08:46:01 -05:00
Derrick Hammer 724a0f0135
*switch plugin loading to use require and not import
2022-12-13 07:23:18 -05:00
Derrick Hammer 66995f5d7f
*Lowercase secrets
2022-12-12 09:13:55 -05:00
Derrick Hammer c6652efeb0
*Make secrets caps
2022-12-07 05:43:13 -05:00
Derrick Hammer 3835cf014e
*Update lock
2022-12-07 03:02:56 -05:00
Derrick Hammer aad4a075cf
*Update lock
2022-12-07 02:31:23 -05:00
Derrick Hammer d54fe4666f
*Types changed with chalk
2022-12-07 02:16:37 -05:00
Derrick Hammer ce713cdb15
*Downgrade chalk for commonjs support
2022-12-07 02:09:59 -05:00
Derrick Hammer 4742c6844c
*Add log level prefix plugin with chalk formatting
2022-12-07 01:57:21 -05:00
Derrick Hammer d2d9c0445c
*Start service on install and stop it on uninstall
2022-12-07 01:18:12 -05:00
Derrick Hammer 83af4fb226
*Update dht-cache
2022-12-07 01:07:45 -05:00
Derrick Hammer 80a3354866
*ensure loglevel is configurable 2022-12-07 00:46:16 -05:00
Derrick Hammer 7bd0a72113
*Move relay identity log to swarm
2022-12-06 17:03:30 -05:00
Derrick Hammer c25e8b4aff
*generateSeedPhraseDeterministic returns an array, need only the 1st element
2022-12-06 16:55:58 -05:00
Derrick Hammer 561be145ce
*Update cfg package
2022-12-06 16:49:08 -05:00
Derrick Hammer 2a6a0bf94c
*Update cfg package
2022-12-06 16:41:18 -05:00
Derrick Hammer ededa55b57
*Dont require a domain
2022-12-06 16:22:28 -05:00
Derrick Hammer d8f0288370
*Update lock
2022-12-06 15:57:50 -05:00
Derrick Hammer 64b80001a0
*We should no longer need to delete this copy of the dep with resolutions 2022-12-06 15:57:38 -05:00
Derrick Hammer b4e09c05cf
*add cli-progress to dev deps to prevent errors 2022-12-06 15:57:11 -05:00
Derrick Hammer 8b1e823991
*Use resolutions to force hyperswarm to v6 and node-fetch to v2 2022-12-06 15:56:54 -05:00
Derrick Hammer 67e432c345
*Use new do-cdn-purge CI plugin
2022-12-06 15:17:37 -05:00
Derrick Hammer c6b136c214
*Add new build step to flush package cdn 2022-12-06 14:10:58 -05:00
Derrick Hammer eedd6e7da5
*Remove pocket config check
2022-12-06 07:22:03 -05:00
Derrick Hammer 8ea5a38b83
*Remove self from peers list 2022-12-05 15:38:58 -05:00
Derrick Hammer 9e480f1cfb
*Update lock
2022-12-05 15:22:33 -05:00
Derrick Hammer bbc9020f66
*Update get_direct_peers to filter against dhtCache online list 2022-12-05 15:22:18 -05:00
Derrick Hammer cd7b12e8b3
*add get_peers and get_direct_peers api methods to the core rpc plugin
2022-12-04 12:04:18 -05:00
Derrick Hammer e35b602133
*Add getter for dhtCache 2022-12-04 12:03:36 -05:00
Derrick Hammer 0e6c84c566
*Make default topic hash an exported const 2022-12-04 12:03:09 -05:00
Derrick Hammer 4121e23fd9
*If the rpc call returns no value, default to a true boolean 2022-12-04 07:14:36 -05:00
Derrick Hammer a87660b678
*Update lock and package.json
2022-12-04 01:28:51 -05:00
Derrick Hammer 7dff9a1ab4
*Only try to stringify the data if it is not already a string 2022-12-04 01:28:05 -05:00
Derrick Hammer 9393ffc4c1
*if cached merge in the cached item signature with the rpc response 2022-12-04 01:25:55 -05:00
Derrick Hammer c8c19b77a6
*Switch to json-stringify-deterministic 2022-12-04 01:11:05 -05:00
Derrick Hammer d7897af137
*Prevent recursive broadcast_request 2022-12-04 01:01:13 -05:00
Derrick Hammer 64611618de
*Update to use NodeCache api 2022-12-03 22:55:39 -05:00
Derrick Hammer 616b74a820
*Wrap cache delete in try/catch
2022-11-28 02:06:39 -05:00
Derrick Hammer 4bb0636a8d
*Unneeded import 2022-11-28 02:04:35 -05:00
Derrick Hammer 69fd9a14ef
*Switch to node cache 2022-11-28 02:03:50 -05:00
Derrick Hammer 0387316e4f
*Refactor broadcast handling to call RPCServer.handleRequest if it is a loopback request
*check for request module and request method
*Bug fix processing of responses
2022-11-28 01:37:54 -05:00
Derrick Hammer 9b15c738e9
*Allow handleRequest to be publicly called
*If getMethodByRequest returns an error, treat as a request error
2022-11-28 01:35:35 -05:00
Derrick Hammer cb2299f9e8
*Buffer conversion bugfix 2022-11-28 00:31:37 -05:00
Derrick Hammer 77d72a6666
*Update lock 2022-11-28 00:24:50 -05:00
Derrick Hammer 029aab6901
*Fix type handling 2022-11-28 00:24:36 -05:00
Derrick Hammer 1c37f7809c
*If we need to wait for the connection, join the peer 2022-11-28 00:16:25 -05:00
Derrick Hammer 2f9a0c7356
*Refactor getRpcByPeer to operate in Buffers 2022-11-28 00:14:11 -05:00
Derrick Hammer 91642ea729
*Update imports 2022-11-27 18:16:08 -05:00
Derrick Hammer c5febf06d8
*Update lock 2022-11-27 18:15:08 -05:00
Derrick Hammer 8895f557e5
*getNodeQuery is not needed, for now 2022-11-27 18:12:28 -05:00
Derrick Hammer 364e628c7a
*Simplify clear_cached_item so it just needs to use broadcast_request as a proxy 2022-11-27 18:10:21 -05:00
Derrick Hammer 86ce21a4b4
*If we have a cached request, release the lock
2022-11-26 18:23:26 -05:00
Derrick Hammer 9ce66b15a3
*Add query hash to dht cache 2022-11-26 17:53:39 -05:00
Derrick Hammer 83b62bfdcb
*Refactor mutex lock logic 2022-11-26 17:53:16 -05:00
Derrick Hammer 5c02356595
*start swarm in boot first, before even plugins since they kickstart the rpc singleton 2022-11-26 17:13:37 -05:00
Derrick Hammer ec33e40c74
*Export swarm start
*Make swarm get non async to prevent race conditions
2022-11-26 17:13:02 -05:00
Derrick Hammer ebd09f9a52
*Bug fix signData 2022-11-26 17:11:48 -05:00
Derrick Hammer 0d5aa24b74
*privateKey needs to be secretKey 2022-11-26 14:36:44 -05:00
38 changed files with 12409 additions and 5292 deletions

37
.circleci/config.yml Normal file

@ -0,0 +1,37 @@
version: 2.1
orbs:
node: circleci/node@5.1.0
ssh: credijusto/ssh@0.5.2
workflows:
release:
jobs:
- node/run:
name: build
npm-run: build
filters:
branches:
only:
- master
- develop
- /^develop-.*$/
- node/run:
name: release
npm-run: semantic-release
requires:
- build
filters:
branches:
only:
- master
- develop
- /^develop-.*$/
context:
- publish
setup:
- add_ssh_keys:
fingerprints:
- "47:cf:a1:17:d9:81:8e:c5:51:e5:53:c8:33:e4:33:b9"
- ssh/ssh-add-host:
host_url: GITEA_HOST

7
.gitignore vendored

@ -1 +1,8 @@
node_modules
.yarn/*
!.yarn/cache
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/sdks
!.yarn/versions

4
.npmrc Normal file

@ -0,0 +1,4 @@
public-hoist-pattern[]=udx-native
public-hoist-pattern[]=sodium-native
public-hoist-pattern[]=@hyperswarm/dht
public-hoist-pattern[]=hypercore-crypto

38
.releaserc Normal file

@ -0,0 +1,38 @@
{
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/changelog",
[
"@semantic-release/exec",
{
"publishCmd": "./ci/publish.sh \"${nextRelease.version}\""
}
],
[
"@semantic-release/npm",
{
"npmPublish": false
}
],
[
"@semantic-release/git",
{
"assets": [
"package.json"
]
}
]
],
"branches": [
"master",
{
name: "develop",
prerelease: true
},
{
name: "develop-*",
prerelease: true
},
]
}

View File

@ -1,34 +0,0 @@
pipeline:
build:
image: git.lumeweb.com/lumeweb/ci-node
commands:
- yarn
- yarn build
package:
image: ghcr.io/goreleaser/nfpm
commands:
- nfpm pkg --packager deb
publish_focal:
image: git.lumeweb.com/lumeweb/aptly-publisher
settings:
apt_username:
from_secret: apt_username
apt_password:
from_secret: apt_password
repo: apt.web3relay.io
folder: ubuntu
distro: focal
gpg_password:
from_secret: gpg_password
publish_jammy:
image: git.lumeweb.com/lumeweb/aptly-publisher
settings:
apt_username:
from_secret: apt_username
apt_password:
from_secret: apt_password
repo: apt.web3relay.io
folder: ubuntu
distro: jammy
gpg_password:
from_secret: gpg_password

1
README.md Normal file

@ -0,0 +1 @@
This project is supported by a [Sia Foundation](https://sia.tech) grant.

18
ci/publish.sh Executable file

@ -0,0 +1,18 @@
#!/bin/bash
if ! command -v go &>/dev/null; then
sudo apt-get update && sudo apt-get install -y golang
fi
sudo go install github.com/goreleaser/nfpm/v2/cmd/nfpm@latest
sudo chmod +x /root/go/bin/nfpm
yq -i ".version=\"${1}\"" nfpm.yaml
sudo /root/go/bin/nfpm package -p deb
if ! command -v pip &>/dev/null; then
sudo apt-get update && sudo apt-get install -y python-pip
fi
pip2 install --upgrade cloudsmith-cli
cloudsmith push deb lumeweb/lume-web-relay *.deb

11581
package-lock.json generated Normal file

File diff suppressed because it is too large

View File

@ -1,76 +1,75 @@
{
"name": "@lumeweb/relay",
"type": "commonjs",
"version": "0.1.0",
"version": "0.1.0-develop.1",
"description": "",
"main": "build/index.js",
"types": "src/types.ts",
"repository": {
"url": "gitea@git.lumeweb.com:LumeWeb/relay.git"
},
"author": {
"name": "Derrick Hammer",
"email": "contact@lumeweb.com"
},
"scripts": {
"semantic-release": "semantic-release",
"compile": "tsc",
"prebuild": "bash prebuild.sh",
"package": "pkg -c pkg.json build/index.js -t linux --public --no-native-build -C gzip",
"package-debug": "pkg -c pkg.json build/index.js -b -t linux --no-bytecode --public",
"build": "npm run compile && npm run prebuild && npm run package",
"barebuild": "npm run compile && npm run package"
"postinstall": "patch-package"
},
"dependencies": {
"@hyperswarm/dht": "^6.0.1",
"@hyperswarm/dht-relay": "^0.3.0",
"@lumeweb/cfg": "https://github.com/LumeWeb/bcfg.git",
"@lumeweb/dht-cache": "https://git.lumeweb.com/LumeWeb/dht-cache.git",
"@lumeweb/kernel-utils": "https://github.com/LumeWeb/kernel-utils.git",
"@lumeweb/pokt-rpc-endpoints": "https://github.com/LumeWeb/pokt-rpc-endpoints.git",
"@skynetlabs/skynet-nodejs": "^2.6.0",
"@solana/web3.js": "^1.47.3",
"@types/acme-client": "^3.3.0",
"@types/node": "^18.0.0",
"@types/node-cron": "^3.0.2",
"@types/ws": "^8.5.3",
"ajv": "^8.11.0",
"@fastify/websocket": "^7.2.0",
"@hyperswarm/dht-relay": "^0.4.0",
"@lumeweb/cfg": "git+https://git.lumeweb.com/LumeWeb/cfg.git",
"@lumeweb/interface-relay": "git+https://git.lumeweb.com/LumeWeb/interface-relay",
"@scure/bip39": "^1.2.0",
"@types/node": "^18.15.11",
"@types/ws": "^8.5.4",
"async-mutex": "^0.3.2",
"b4a": "^1.6.1",
"b4a": "^1.6.3",
"compact-encoding": "^2.11.0",
"date-fns": "^2.28.0",
"dotenv": "^16.0.1",
"ethers": "^5.6.9",
"express": "^4.18.1",
"fetch-blob": "https://github.com/LumeWeb/fetch-blob.git",
"hyperswarm": "^3.0.4",
"json-stable-stringify": "^1.0.1",
"libskynet": "https://github.com/LumeWeb/libskynet.git",
"libskynetnode": "https://github.com/LumeWeb/libskynetnode.git",
"loady": "https://github.com/LumeWeb/loady.git",
"loglevel": "^1.8.0",
"msgpackr": "^1.6.1",
"dotenv": "^16.0.3",
"ed25519-keygen": "github:LumeWeb/ed25519-keygen",
"ethers": "^5.7.2",
"eventemitter2": "^6.4.9",
"fastify": "^4.15.0",
"fetch-blob": "github:LumeWeb/fetch-blob",
"hyperswarm": "^4.4.0",
"json-stable-stringify": "^1.0.2",
"json-stringify-deterministic": "^1.0.8",
"loady": "github:LumeWeb/loady",
"msgpackr": "^1.8.5",
"node-cache": "^5.1.2",
"node-cron": "^3.0.1",
"node-fetch": "2",
"ordered-json": "^0.1.1",
"node-fetch": "^2.6.9",
"p-defer": "git+https://git.lumeweb.com/LumeWeb/p-defer.git",
"p-timeout": "git+https://git.lumeweb.com/LumeWeb/p-timeout.git",
"pino": "^8.11.0",
"pino-pretty": "^9.4.0",
"promise-retry": "^2.0.1",
"protomux": "^3.4.0",
"protomux": "^3.4.1",
"protomux-rpc": "^1.3.0",
"random-access-memory": "^4.1.0",
"random-key": "^0.3.2",
"slugify": "^1.6.5",
"sodium-universal": "^3.1.0"
"slugify": "^1.6.6",
"sodium-universal": "^4.0.0"
},
"devDependencies": {
"@lumeweb/relay-types": "https://git.lumeweb.com/LumeWeb/relay-types.git",
"@semantic-release/changelog": "^6.0.3",
"@semantic-release/exec": "^6.0.3",
"@semantic-release/git": "^10.0.1",
"@types/b4a": "^1.6.0",
"@types/express": "^4.17.13",
"@types/minimatch": "^3.0.5",
"@types/node-fetch": "^2.6.2",
"@types/node-fetch": "^2.6.3",
"cli-progress": "^3.12.0",
"hyper-typings": "^1.0.0",
"node-gyp": "^9.1.0",
"pkg": "^5.8.0",
"node-gyp": "^9.3.1",
"patch-package": "^6.5.1",
"pkg": "^5.8.1",
"prebuildify": "^5.0.1",
"prettier": "^2.7.1",
"rollup": "^2.77.0",
"supports-color": "https://github.com/LumeWeb/supports-color.git",
"typescript": "^4.7.4"
"prettier": "^2.8.7",
"semantic-release": "21",
"supports-color": "github:LumeWeb/supports-color",
"typescript": "^4.9.5"
}
}

View File

@ -0,0 +1,20 @@
diff --git a/node_modules/sodium-native/package.json b/node_modules/sodium-native/package.json
index bda9dd4..3a5541a 100644
--- a/node_modules/sodium-native/package.json
+++ b/node_modules/sodium-native/package.json
@@ -3,15 +3,6 @@
"version": "4.0.1",
"description": "Low level bindings for libsodium",
"main": "index.js",
- "files": [
- "index.js",
- "deps/**",
- "modules/**",
- "binding.c",
- "binding.gyp",
- "macros.h",
- "prebuilds/**"
- ],
"dependencies": {
"node-gyp-build": "^4.3.0"
},

View File

@ -1,9 +1,5 @@
{
"assets": [
"node_modules/*/build/Release/*.node",
"node_modules/libskynet",
"node_modules/libskynetnode",
"node_modules/@lumeweb"
],
"outputPath": "dist"
}

View File

@ -1,3 +1,4 @@
#!/usr/bin/env bash
systemctl enable lumeweb-relay.service
systemctl start lumeweb-relay.service

View File

@ -1,3 +1,4 @@
#!/usr/bin/env bash
systemctl stop lumeweb-relay.service
systemctl disable lumeweb-relay.service

View File

@ -1,7 +1,5 @@
#!/bin/bash
rimraf node_modules/libskynetnode/node_modules/node-fetch
for pkg in udx-native sodium-native; do
(
cd "node_modules/${pkg}" || return

View File

@ -1,14 +1,11 @@
//const require = createRequire(import.meta.url);
//import { createRequire } from "module";
// @ts-ignore
import Config from "@lumeweb/cfg";
import * as os from "os";
import * as fs from "fs";
import path from "path";
import { errorExit } from "./lib/error.js";
import log from "./log.js";
const config = new Config("lumeweb-relay");
const config = new Config("lumeweb-relay", "core.confDir");
let configDir;
@ -22,25 +19,21 @@ switch (os.platform()) {
case "linux":
default:
configDir = "/etc/lumeweb/relay/config.d";
configDir = "/etc/lumeweb/relay/conf.d";
break;
}
config.inject({
configDir,
port: 8080,
logLevel: "info",
pluginDir: path.resolve(configDir, "..", "plugins"),
plugins: ["core"],
ssl: false,
"core.confDir": configDir,
"core.port": 8080,
"core.appPort": 80,
"core.logLevel": "info",
"core.pluginDir": path.resolve(configDir, "..", "plugins"),
});
config.load({
env: true,
argv: true,
});
config.load();
configDir = config.str("configdir");
configDir = config.str("core.confDir");
if (fs.existsSync(configDir)) {
try {
@ -50,15 +43,8 @@ if (fs.existsSync(configDir)) {
}
}
config.load({
env: true,
argv: true,
});
config.load();
for (const setting of ["domain"]) {
if (!config.get(setting)) {
errorExit(`Required config option ${setting} not set`);
}
}
log.level = config.get("core.loglevel");
export default config;

View File

@ -1,30 +1,25 @@
import { start as startRpc } from "./modules/rpc.js";
import { start as startRelay } from "./modules/relay.js";
import { start as startApp } from "./modules/app";
import log from "loglevel";
import config from "./config.js";
import { loadPlugins } from "./modules/plugin.js";
import { start as startDns } from "./modules/dns.js";
import { start as startSSl } from "./modules/ssl.js";
import { generateSeedPhraseDeterministic } from "libskynet";
import * as crypto from "crypto";
import { getPluginAPI, loadPlugins } from "./modules/plugin.js";
import { start as startSwarm, get as getSwarm } from "./modules/swarm.js";
import * as bip39 from "@scure/bip39";
import { wordlist } from "@scure/bip39/wordlists/english";
log.setDefaultLevel(config.str("log-level"));
if (!config.str("seed")) {
config.saveConfigJson("account.json", {
seed: generateSeedPhraseDeterministic(
crypto.randomBytes(100).toString("hex")
),
if (!config.str("core.seed")) {
config.save("account", {
core: {
seed: bip39.generateMnemonic(wordlist),
},
});
}
async function boot() {
await startSwarm();
await loadPlugins();
await startApp();
await startRpc();
await startDns();
await startSSl();
await startRelay();
}
@ -33,9 +28,12 @@ boot();
process.on("uncaughtException", function (err) {
console.log(`Caught exception: ${err.message} ${err.stack}`);
});
process.on("SIGINT", function () {
async function shutdown() {
await getPluginAPI().emitAsync("core.shutdown");
await getSwarm().destroy();
process.exit();
});
process.on("SIGTERM", function () {
process.exit();
});
}
process.on("SIGINT", shutdown);
process.on("SIGTERM", shutdown);

View File

@ -1,4 +1,4 @@
import log from "loglevel";
import log from "../log.js";
export function errorExit(msg: string): void {
log.error(msg);

View File

@ -1,445 +0,0 @@
import type { Err, progressiveFetchResult } from "libskynet";
// @ts-ignore
import { SkynetClient } from "@skynetlabs/skynet-nodejs";
import type {
IndependentFileSmall,
IndependentFileSmallMetadata,
} from "@lumeweb/relay-types";
import {
addContextToErr,
blake2b,
bufToHex,
ed25519Sign,
encodePrefixedBytes,
encodeU64,
defaultPortalList,
skylinkToResolverEntryData,
encryptFileSmall,
decryptFileSmall,
entryIDToSkylink,
deriveRegistryEntryID,
taggedRegistryEntryKeys,
namespaceInode,
deriveChildSeed,
bufToB64,
} from "libskynet";
import { readRegistryEntry, progressiveFetch, upload } from "libskynetnode";
const ERR_NOT_EXISTS = "DNE";
const STD_FILENAME = "file";
async function overwriteRegistryEntry(
keypair: any,
datakey: Uint8Array,
data: Uint8Array,
revision: bigint
): Promise<null> {
return new Promise((resolve, reject) => {
if (data.length > 86) {
reject("provided data is too large to fit in a registry entry");
return;
}
let [encodedRevision, errU64] = encodeU64(revision);
if (errU64 !== null) {
reject(addContextToErr(errU64, "unable to encode revision number"));
return;
}
let datakeyHex = bufToHex(datakey);
let [encodedData, errEPB] = encodePrefixedBytes(data);
if (errEPB !== null) {
reject(addContextToErr(errEPB, "unable to encode the registry data"));
return;
}
let dataToSign = new Uint8Array(32 + 8 + data.length + 8);
dataToSign.set(datakey, 0);
dataToSign.set(encodedData, 32);
dataToSign.set(encodedRevision, 32 + 8 + data.length);
let sigHash = blake2b(dataToSign);
let [sig, errS] = ed25519Sign(sigHash, keypair.secretKey);
if (errS !== null) {
reject(addContextToErr(errS, "unable to produce signature"));
return;
}
let postBody = {
publickey: {
algorithm: "ed25519",
key: Array.from(keypair.publicKey),
},
datakey: datakeyHex,
revision: Number(revision),
data: Array.from(data),
signature: Array.from(sig),
};
let fetchOpts = {
method: "post",
body: JSON.stringify(postBody),
};
let endpoint = "/skynet/registry";
progressiveFetch(
endpoint,
fetchOpts,
defaultPortalList,
verifyRegistryWrite
).then((result: progressiveFetchResult) => {
if (result.success === true) {
resolve(null);
return;
}
reject("unable to write registry entry\n" + JSON.stringify(result));
});
});
}
async function verifyRegistryWrite(response: Response): Promise<Err> {
return new Promise((resolve) => {
if (!("status" in response)) {
resolve("response did not contain a status");
return;
}
if (response.status === 204) {
resolve(null);
return;
}
resolve("unrecognized status");
});
}
async function createIndependentFileSmall(
seed: Uint8Array,
userInode: string,
fileData: Uint8Array
): Promise<[IndependentFileSmall, Err]> {
return new Promise(async (resolve) => {
let [inode, errNI] = namespaceInode("IndependentFileSmall", userInode);
if (errNI !== null) {
resolve([{} as any, addContextToErr(errNI, "unable to namespace inode")]);
return;
}
let [keypair, dataKey, errTREK] = taggedRegistryEntryKeys(
seed,
inode,
inode
);
if (errTREK !== null) {
resolve([
{} as any,
addContextToErr(
errTREK,
"unable to get registry entry for provided inode"
),
]);
return;
}
let result;
try {
result = await readRegistryEntry(keypair.publicKey, dataKey);
} catch (e) {
result = { exists: false };
}
if (result.exists === true) {
resolve([{} as any, "exists"]);
return;
}
let encryptionKey = deriveChildSeed(seed, inode);
let metadata: IndependentFileSmallMetadata = {
largestHistoricSize: BigInt(fileData.length),
};
let revisionSeed = new Uint8Array(seed.length + 8);
revisionSeed.set(seed, 0);
let revisionKey = deriveChildSeed(revisionSeed, inode);
let revision = BigInt(revisionKey[0]) * 256n + BigInt(revisionKey[1]);
let [encryptedData, errEF] = encryptFileSmall(
encryptionKey,
inode,
revision,
metadata,
fileData,
metadata.largestHistoricSize
);
if (errEF !== null) {
resolve([{} as any, addContextToErr(errEF, "unable to encrypt file")]);
return;
}
let immutableSkylink;
try {
immutableSkylink = await upload(encryptedData, {
Filename: STD_FILENAME,
});
} catch (e) {
resolve([{} as any, addContextToErr(e, "upload failed")]);
return;
}
let [entryData, errSTRED] = skylinkToResolverEntryData(immutableSkylink);
if (errSTRED !== null) {
resolve([
{} as any,
addContextToErr(
errSTRED,
"couldn't create resovler link from upload skylink"
),
]);
return;
}
try {
await overwriteRegistryEntry(keypair, dataKey, entryData, revision);
} catch (e: any) {
resolve([
{} as any,
addContextToErr(e, "could not write to registry entry"),
]);
return;
}
let [entryID, errDREID] = deriveRegistryEntryID(keypair.publicKey, dataKey);
if (errDREID !== null) {
resolve([
{} as any,
addContextToErr(errDREID, "could not compute entry id"),
]);
return;
}
let skylink = entryIDToSkylink(entryID);
let encStr = bufToB64(encryptionKey);
let viewKey = encStr + inode;
let ifile: IndependentFileSmall = {
dataKey,
fileData,
inode,
keypair,
metadata,
revision,
seed,
skylink,
viewKey,
overwriteData: function (newData: Uint8Array): Promise<Err> {
return overwriteIndependentFileSmall(ifile, newData);
},
readData: function (): Promise<[Uint8Array, Err]> {
return new Promise((resolve) => {
let data = new Uint8Array(ifile.fileData.length);
data.set(ifile.fileData, 0);
resolve([data, null]);
});
},
};
resolve([ifile, null]);
});
}
async function openIndependentFileSmall(
seed: Uint8Array,
userInode: string
): Promise<[IndependentFileSmall, Err]> {
return new Promise(async (resolve) => {
let [inode, errNI] = namespaceInode("IndependentFileSmall", userInode);
if (errNI !== null) {
resolve([{} as any, addContextToErr(errNI, "unable to namespace inode")]);
return;
}
let [keypair, dataKey, errTREK] = taggedRegistryEntryKeys(
seed,
inode,
inode
);
if (errTREK !== null) {
resolve([
{} as any,
addContextToErr(
errTREK,
"unable to get registry entry for provided inode"
),
]);
return;
}
let result;
try {
result = await readRegistryEntry(keypair.publicKey, dataKey);
} catch (e: any) {
resolve([
{} as any,
addContextToErr(e, "unable to read registry entry for file"),
]);
return;
}
if (result.exists !== true) {
resolve([{} as any, ERR_NOT_EXISTS]);
return;
}
let [entryID, errDREID] = deriveRegistryEntryID(keypair.publicKey, dataKey);
if (errDREID !== null) {
resolve([
{} as any,
addContextToErr(errDREID, "unable to derive registry entry id"),
]);
return;
}
let skylink = entryIDToSkylink(entryID);
const client = new SkynetClient("https://web3portal.com");
let encryptedData;
try {
encryptedData = await client.downloadData(skylink);
} catch (e: any) {
resolve([{} as any, addContextToErr(e, "unable to download file")]);
return;
}
let encryptionKey = deriveChildSeed(seed, inode);
let [metadata, fileData, errDF] = decryptFileSmall(
encryptionKey,
inode,
encryptedData
);
if (errDF !== null) {
resolve([{} as any, addContextToErr(errDF, "unable to decrypt file")]);
return;
}
let encStr = bufToB64(encryptionKey);
let viewKey = encStr + inode;
let ifile: IndependentFileSmall = {
dataKey,
fileData,
inode,
keypair,
metadata,
revision: result.revision!,
seed,
skylink,
viewKey,
overwriteData: function (newData: Uint8Array): Promise<Err> {
return overwriteIndependentFileSmall(ifile, newData);
},
readData: function (): Promise<[Uint8Array, Err]> {
return new Promise((resolve) => {
let data = new Uint8Array(ifile.fileData.length);
data.set(ifile.fileData, 0);
resolve([data, null]);
});
},
};
resolve([ifile, null]);
});
}
async function overwriteIndependentFileSmall(
file: IndependentFileSmall,
newData: Uint8Array
): Promise<Err> {
return new Promise(async (resolve) => {
// Create a new metadata for the file based on the current file
// metadata. Need to update the largest historic size.
let newMetadata: IndependentFileSmallMetadata = {
largestHistoricSize: BigInt(file.metadata.largestHistoricSize),
};
if (BigInt(newData.length) > newMetadata.largestHistoricSize) {
newMetadata.largestHistoricSize = BigInt(newData.length);
}
// Compute the new revision number for the file. This is done
// deterministically using the seed and the current revision number, so
// that multiple concurrent updates will end up with the same revision.
// We use a random number between 1 and 256 for our increment.
let [encodedRevision, errEU64] = encodeU64(file.revision);
if (errEU64 !== null) {
resolve(addContextToErr(errEU64, "unable to encode revision"));
return;
}
let revisionSeed = new Uint8Array(
file.seed.length + encodedRevision.length
);
revisionSeed.set(file.seed, 0);
revisionSeed.set(encodedRevision, file.seed.length);
let revisionKey = deriveChildSeed(revisionSeed, file.inode);
let newRevision = file.revision + BigInt(revisionKey[0]) + 1n;
// Get the encryption key.
let encryptionKey = deriveChildSeed(file.seed, file.inode);
// Create a new encrypted blob for the data.
//
// NOTE: Need to supply the data that would be in place after a
// successful update, which means using the new metadata and revision
// number.
let [encryptedData, errEFS] = encryptFileSmall(
encryptionKey,
file.inode,
newRevision,
newMetadata,
newData,
newMetadata.largestHistoricSize
);
if (errEFS !== null) {
resolve(addContextToErr(errEFS, "unable to encrypt updated file"));
return;
}
// Upload the data to get the immutable link.
let skylink;
try {
skylink = await upload(encryptedData, {
Filename: STD_FILENAME,
});
} catch (e) {
resolve(addContextToErr(e, "new data upload failed"));
return;
}
// Write to the registry entry.
let [entryData, errSTRED] = skylinkToResolverEntryData(skylink);
if (errSTRED !== null) {
resolve(
addContextToErr(
errSTRED,
"could not create resolver link from upload skylink"
)
);
return;
}
try {
await overwriteRegistryEntry(
file.keypair,
file.dataKey,
entryData,
newRevision
);
} catch (e: any) {
resolve(addContextToErr(e, "could not write to registry entry"));
return;
}
// File update was successful, update the file metadata.
file.revision = newRevision;
file.metadata = newMetadata;
file.fileData = newData;
resolve(null);
});
}
export {
createIndependentFileSmall,
openIndependentFileSmall,
overwriteIndependentFileSmall,
};

32
src/lib/seed.ts Normal file

@ -0,0 +1,32 @@
import { HDKey } from "ed25519-keygen/dist/hdkey";
import config from "../config";
import * as bip39 from "@scure/bip39";
import { wordlist } from "@scure/bip39/wordlists/english";
import { errorExit } from "./error.js";
import b4a from "b4a";
const BIP44_PATH = "m/44'/1627'/0'/0'/0'";
export function getSeed() {
const seed = config.str("core.seed");
let valid = bip39.validateMnemonic(seed, wordlist);
if (!valid) {
errorExit("LUME_WEB_RELAY_SEED is invalid. Aborting.");
}
return bip39.mnemonicToSeedSync(seed);
}
export function getHDKey(): HDKey {
return HDKey.fromMasterSeed(getSeed()).derive(BIP44_PATH);
}
export function getKeyPair(): { publicKey: Uint8Array; secretKey: Uint8Array } {
const key = getHDKey();
return {
publicKey: key.publicKeyRaw,
secretKey: b4a.concat([key.privateKey, key.publicKeyRaw]),
};
}
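
The new src/lib/seed.ts above derives the relay identity from a BIP39 mnemonic along the fixed BIP44 path. Below is a rough, non-authoritative sketch of how the exported helpers might be consumed; the Fastify route further down in this diff does essentially the same thing with the public key, and the variable names here are illustrative only.

```ts
// Illustrative consumer of the helpers exported by src/lib/seed.ts.
import b4a from "b4a";
import { getHDKey, getKeyPair } from "./lib/seed.js";

// getKeyPair() packs the 32-byte private scalar and the 32-byte raw public
// key into the 64-byte secretKey layout expected by libsodium-style APIs.
const { publicKey, secretKey } = getKeyPair();
console.log("relay identity:", b4a.toString(publicKey, "hex"));
console.log("secret key bytes:", secretKey.length); // 64

// The underlying HD node is available when further derivation is needed.
const hd = getHDKey();
console.log("raw public key:", b4a.toString(hd.publicKeyRaw, "hex"));
```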

View File

@ -1,17 +0,0 @@
import config from "../config";
import { seedPhraseToSeed } from "libskynet";
export function dynImport(module: string) {
return Function(`return import("${module}")`)() as Promise<any>;
}
export function getSeed(): Uint8Array {
let [seed, err] = seedPhraseToSeed(config.str("seed"));
if (err) {
console.error(err);
process.exit(1);
}
return seed;
}

9
src/log.ts Normal file

@ -0,0 +1,9 @@
import pino from "pino";
import pretty from "pino-pretty";
const stream = pretty({
colorize: true,
});
const log = pino(stream);
export default log;
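
src/log.ts now exposes a single pino instance wrapped in pino-pretty, and the rest of this diff hands out child loggers from it (the Fastify server is given log.child({ module: "app-server" }) and the plugin loader returns log.child({ plugin: plugin.name })). A minimal sketch of that pattern, with the "swarm" tag chosen purely for illustration:

```ts
import log from "./log.js";

// Child loggers share the pretty-printed stream and level but tag every
// line, so per-module output stays distinguishable.
const swarmLog = log.child({ module: "swarm" }); // illustrative tag
swarmLog.info("swarm listening");
swarmLog.debug({ peers: 0 }, "no peers yet");

// The level itself stays configurable; the config module in this diff
// assigns it from the configured core log level at startup.
log.level = "debug";
```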

View File

@ -1,60 +1,30 @@
import express, { Express } from "express";
import http from "http";
import { AddressInfo } from "net";
import log from "loglevel";
import { getKeyPair } from "./swarm.js";
import log from "../log.js";
import fastify from "fastify";
import type { FastifyInstance } from "fastify";
import { getKeyPair } from "../lib/seed.js";
import config from "../config";
import { getPluginAPI } from "./plugin";
let app: Express;
let router = express.Router();
let server: http.Server;
export function getRouter(): express.Router {
return router;
}
export function setRouter(newRouter: express.Router): void {
router = newRouter;
}
let app: FastifyInstance;
export async function start() {
app = express();
server = http.createServer(app);
resetRouter();
await new Promise((resolve) => {
server.listen(80, "0.0.0.0", function () {
const address = server.address() as AddressInfo;
log.info(
"HTTP/App Server started on ",
`${address.address}:${address.port}`
);
resolve(null);
});
const keyPair = getKeyPair();
app = fastify({
logger: log.child({ module: "app-server" }),
});
app.use(function (req, res, next) {
router(req, res, next);
});
}
export function getApp(): Express {
return app;
}
export function getServer(): http.Server {
return server;
}
export function resetRouter(): void {
setRouter(newRouter());
}
function newRouter(): express.Router {
const router = express.Router();
let keyPair = getKeyPair();
router.get("/", (req, res) => {
app.get("/", (req, res) => {
res.send(Buffer.from(keyPair.publicKey).toString("hex"));
});
return router;
await getPluginAPI().emitAsync("core.appServer.buildRoutes");
await app.listen({ port: config.uint("core.appPort"), host: "0.0.0.0" });
getPluginAPI().emit("core.appServer.started");
}
export function get(): FastifyInstance {
return app;
}
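
The rewritten app module above waits on the async core.appServer.buildRoutes event before calling listen(), so plugins can add routes to the shared Fastify instance first. The following hypothetical plugin sketches that hook; the plugin shape (a name plus an async plugin(api) entry point), the scoped pluginConfig, and the per-plugin logger are inferred from the plugin loader further down in this diff, and the "health" name and route are invented for illustration.

```ts
import type { Plugin } from "@lumeweb/interface-relay";

// Hypothetical plugin: reads its scoped config, then registers a route
// while the app module is building routes (before Fastify starts listening).
const healthPlugin: Plugin = {
  name: "health", // illustrative name
  async plugin(api) {
    // pluginConfig is scoped to plugin.health.* by the loader's proxy.
    const path = api.pluginConfig.get("path", "/health");
    api.once("core.appServer.buildRoutes", () => {
      // api.app is the shared Fastify instance created in the app module.
      api.app.get(path, async () => ({ ok: true }));
    });
    api.logger.info("health plugin initialized"); // per-plugin child logger
  },
};

export default healthPlugin;
```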

View File

@ -1,59 +0,0 @@
import cron from "node-cron";
import { get as getSwarm } from "./swarm.js";
import { Buffer } from "buffer";
import { pack } from "msgpackr";
import config from "../config.js";
import log from "loglevel";
import fetch from "node-fetch";
import { overwriteRegistryEntry } from "libskynetnode";
import type { DnsProvider } from "@lumeweb/relay-types";
// @ts-ignore
import { hashDataKey } from "@lumeweb/kernel-utils";
let activeIp: string;
const REGISTRY_NODE_KEY = "lumeweb-dht-node";
let dnsProvider: DnsProvider = async (ip) => {};
export function setDnsProvider(provider: DnsProvider) {
dnsProvider = provider;
}
async function ipUpdate() {
let currentIp = await getCurrentIp();
if (activeIp && currentIp === activeIp) {
return;
}
const domain = config.str("domain");
await dnsProvider(currentIp, domain);
activeIp = currentIp;
log.info(`Updated DynDNS hostname ${domain} to ${activeIp}`);
}
export async function start() {
const swarm = (await getSwarm()) as any;
await ipUpdate();
await overwriteRegistryEntry(
swarm.dht.defaultKeyPair,
hashDataKey(REGISTRY_NODE_KEY),
pack(`${config.str("domain")}:${config.uint("port")}`)
);
log.info(
"Relay Identity is",
Buffer.from(swarm.dht.defaultKeyPair.publicKey).toString("hex")
);
cron.schedule("0 * * * *", ipUpdate);
}
async function getCurrentIp(): Promise<string> {
return await (await fetch("http://ip1.dynupdate.no-ip.com/")).text();
}

View File

@ -1,36 +1,124 @@
import config from "../config.js";
import type { RPCServer } from "./rpc/server.js";
import { getRpcServer } from "./rpc/server.js";
import type { PluginAPI, RPCMethod, Plugin } from "@lumeweb/relay-types";
import type { Plugin, RPCMethod } from "@lumeweb/interface-relay";
import slugify from "slugify";
import * as fs from "fs";
import path from "path";
import {
getSavedSsl,
getSsl,
getSslContext,
saveSSl,
setSsl,
setSSlCheck,
setSslContext,
} from "./ssl.js";
import log from "loglevel";
import { getSeed } from "../lib/util.js";
import { getRouter, resetRouter, setRouter } from "./app.js";
import {
createIndependentFileSmall,
openIndependentFileSmall,
overwriteIndependentFileSmall,
} from "../lib/file";
import { setDnsProvider } from "./dns";
import pluginRpc from "./plugins/rpc";
import pluginCore from "./plugins/core";
import type { Logger } from "pino";
let pluginApi: PluginApiManager;
import { getHDKey, getSeed } from "../lib/seed.js";
import type Config from "@lumeweb/cfg";
import EventEmitter2 from "eventemitter2";
import log from "../log.js";
import {
get as getSwarm,
getProtocolManager,
ProtocolManager,
} from "./swarm.js";
import { get as getApp } from "./app.js";
import type { HDKey } from "ed25519-keygen/dist/hdkey";
import corePlugins from "../plugins";
import Util from "./plugin/util";
let pluginAPIManager: PluginAPIManager;
let pluginAPI: PluginAPI;
const sanitizeName = (name: string) =>
slugify(name, { lower: true, strict: true });
export class PluginApiManager {
class PluginAPI extends EventEmitter2 {
private _server: RPCServer;
constructor({
config,
server,
swarm,
}: {
config: Config;
server: RPCServer;
swarm: any;
}) {
super({
wildcard: true,
verboseMemoryLeak: true,
maxListeners: 0,
});
this._config = config;
this._server = server;
this._swarm = swarm;
}
private _util: Util = new Util();
get util(): Util {
return this._util;
}
private _swarm: any;
get swarm(): any {
return this._swarm;
}
private _config: Config;
get config(): Config {
return this._config;
}
get pluginConfig(): Config {
throw new Error("not implemented and should not be called");
}
get logger(): Logger {
throw new Error("not implemented and should not be called");
}
get rpcServer(): RPCServer {
return this._server;
}
get seed(): Uint8Array {
return getSeed();
}
get identity(): HDKey {
return getHDKey();
}
get protocols(): ProtocolManager {
return getProtocolManager();
}
get app() {
return getApp();
}
public loadPlugin(
moduleName: string
): (moduleName: string) => Promise<Plugin> {
return getPluginAPIManager().loadPlugin;
}
registerMethod(methodName: string, method: RPCMethod): void {
throw new Error("not implemented and should not be called");
}
}
export function getPluginAPI(): PluginAPI {
if (!pluginAPI) {
pluginAPI = new PluginAPI({
config,
server: getRpcServer(),
swarm: getSwarm(),
});
}
return pluginAPI as PluginAPI;
}
export class PluginAPIManager {
private registeredPlugins: Map<string, Plugin> = new Map<string, Plugin>();
public async loadPlugin(moduleName: string): Promise<Plugin> {
@@ -42,7 +130,7 @@ export class PluginApiManager {
const paths = [];
for (const modulePath of [`${moduleName}.js`, `${moduleName}.mjs`]) {
const fullPath = path.join(config.get("plugindir"), modulePath);
const fullPath = path.join(config.get("core.plugindir"), modulePath);
if (fs.existsSync(fullPath)) {
paths.push(fullPath);
break;
@@ -54,84 +142,117 @@ export class PluginApiManager {
}
let plugin: Plugin;
let pluginPath = paths.shift();
try {
plugin = (await import(paths.shift() as string)) as Plugin;
plugin = require(pluginPath as string) as Plugin;
} catch (e) {
throw e;
}
return this.loadPluginInstance(plugin);
log.debug("Loaded plugin %s", moduleName);
const instance = await this.loadPluginInstance(plugin);
if (!instance) {
throw new Error(`Corrupt plugin found at ${pluginPath}`);
}
return instance as Plugin;
}
public async loadPluginInstance(plugin: Plugin): Promise<Plugin> {
public async loadPluginInstance(plugin: Plugin): Promise<Plugin | boolean> {
if ("default" in plugin) {
plugin = plugin?.default as Plugin;
}
if (!("name" in plugin)) {
return false;
}
plugin.name = sanitizeName(plugin.name);
this.registeredPlugins.set(plugin.name, plugin);
try {
plugin.plugin(this.getPluginAPI(plugin.name));
await plugin.plugin(
// @ts-ignore
new Proxy<PluginAPI>(getPluginAPI(), {
get(target: PluginAPI, prop: string): any {
if (prop === "registerMethod") {
return (methodName: string, method: RPCMethod): void => {
return getRpcServer().registerMethod(
plugin.name,
methodName,
method
);
};
}
if (prop === "pluginConfig") {
return new Proxy<Config>(config, {
get(target: Config, prop: string): any {
if (prop === "set") {
return (key: string, value: any): void => {
target.set(`plugin.${plugin.name}.${key}`, value);
};
}
if (prop === "get") {
return (key: string, fallback = null): any => {
return target.get(
`plugin.${plugin.name}.${key}`,
fallback
);
};
}
if (prop === "has") {
return (key: string): any => {
return target.has(`plugin.${plugin.name}.${key}`);
};
}
return (target as any)[prop];
},
});
}
if (prop === "logger") {
return log.child({ plugin: plugin.name });
}
return (target as any)[prop];
},
})
);
} catch (e) {
throw e;
}
return plugin;
}
log.debug("Initialized plugin %s", plugin.name);
private getPluginAPI(pluginName: string): PluginAPI {
return {
config,
registerMethod: (methodName: string, method: RPCMethod): void => {
getRpcServer().registerMethod(pluginName, methodName, method);
},
loadPlugin: getPluginAPI().loadPlugin.bind(getPluginAPI()),
getRpcServer,
ssl: {
setContext: setSslContext,
getContext: getSslContext,
getSaved: getSavedSsl,
set: setSsl,
get: getSsl,
save: saveSSl,
setCheck: setSSlCheck,
},
files: {
createIndependentFileSmall,
openIndependentFileSmall,
overwriteIndependentFileSmall,
},
dns: {
setProvider: setDnsProvider,
},
logger: log,
getSeed,
appRouter: {
get: getRouter,
set: setRouter,
reset: resetRouter,
},
};
return plugin;
}
}
export function getPluginAPI(): PluginApiManager {
if (!pluginApi) {
pluginApi = new PluginApiManager();
export function getPluginAPIManager(): PluginAPIManager {
if (!pluginAPIManager) {
pluginAPIManager = new PluginAPIManager();
}
return pluginApi as PluginApiManager;
return pluginAPIManager as PluginAPIManager;
}
export async function loadPlugins() {
const api = await getPluginAPI();
const apiManager = getPluginAPIManager();
api.loadPluginInstance(pluginCore);
api.loadPluginInstance(pluginRpc);
for (const plugin of [...new Set(config.array("plugins", []))] as []) {
api.loadPlugin(plugin);
for (const plugin of corePlugins) {
await apiManager.loadPluginInstance(plugin);
}
for (const plugin of [...new Set(config.array("core.plugins", []))] as []) {
await apiManager.loadPlugin(plugin);
}
getPluginAPI().emit("core.pluginsLoaded");
}
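The Proxy above rewrites registerMethod so that a plugin's RPC methods are registered under the plugin's sanitized name as the module, and rewrites pluginConfig so that its keys live under plugin.<name>.<key>. A minimal sketch of a third-party plugin against this surface, assuming the @lumeweb/interface-relay typings; the plugin name, config key, and handler are invented:

import type { Plugin, PluginAPI } from "@lumeweb/interface-relay";

const plugin: Plugin = {
  name: "hello",
  async plugin(api: PluginAPI): Promise<void> {
    // Reads config key "plugin.hello.greeting" through the pluginConfig proxy.
    const greeting = api.pluginConfig.get("greeting", "hello world");
    // Registered on the shared RPC server as module "hello", method "ping".
    api.registerMethod("ping", {
      cacheable: false,
      async handler(): Promise<string> {
        api.logger.info("ping received");
        return greeting;
      },
    });
  },
};

export default plugin;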

View File

@@ -0,0 +1,19 @@
import Crypto from "./util/crypto";
import b4a from "b4a";
// @ts-ignore
import c from "compact-encoding";
export default class Util {
private _crypto: Crypto = new Crypto();
get crypto(): Crypto {
return this._crypto;
}
get bufferEncoding(): typeof b4a {
return b4a;
}
get binaryEncoding(): typeof c {
return c;
}
}

View File

@@ -0,0 +1,14 @@
// @ts-ignore
import sodium from "sodium-universal";
import { getPluginAPI } from "../../plugin";
export default class Crypto {
createHash(data: string): Buffer {
const b4a = getPluginAPI().util.bufferEncoding;
const buffer = b4a.from(data);
let hash = b4a.allocUnsafe(32) as Buffer;
sodium.crypto_generichash(hash, buffer);
return hash;
}
}
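Plugins reach these helpers through the shared API instance; a rough illustration, assuming the Util and Crypto accessors shown above (the input string is arbitrary):

// Inside a plugin's plugin(api) callback
const hash = api.util.crypto.createHash("some-data"); // 32-byte generichash as a Buffer
const hex = api.util.bufferEncoding.toString(hash, "hex"); // b4a exposed as bufferEncoding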

View File

@@ -1,132 +0,0 @@
import { getRpcServer } from "../rpc/server";
import {
Plugin,
PluginAPI,
RPCBroadcastRequest,
RPCBroadcastResponse,
RPCClearCacheRequest,
RPCClearCacheResponse,
RPCClearCacheResponseRelayList,
RPCRequest,
RPCResponse,
} from "@lumeweb/relay-types";
import { getRpcByPeer } from "../rpc";
async function broadcastRequest(
request: RPCRequest,
relays: string[]
): Promise<Map<string, Promise<any>>> {
const makeRequest = async (relay: string) => {
const rpc = await getRpcByPeer(relay);
return rpc.request(`${request.module}.${request.method}`, request.data);
};
let relayMap = new Map<string, Promise<any>>();
for (const relay of relays) {
relayMap.set(relay, makeRequest(relay));
}
await Promise.allSettled([...relays.values()]);
return relayMap;
}
const plugin: Plugin = {
name: "rpc",
async plugin(api: PluginAPI): Promise<void> {
api.registerMethod("get_cached_item", {
cacheable: false,
async handler(req: string): Promise<RPCResponse> {
if (typeof req !== "string") {
throw new Error("item must be a string");
}
const cache = getRpcServer().cache.data;
if (!Object.keys(cache).includes(req)) {
throw new Error("item does not exist");
}
return {
data: true,
...cache[req]?.value,
signature: cache[req]?.signature,
};
},
});
api.registerMethod("clear_cached_item", {
cacheable: false,
async handler(req: RPCClearCacheRequest): Promise<RPCClearCacheResponse> {
if (req?.relays?.length) {
let resp = await broadcastRequest(
{
module: "rpc",
method: "clear_cached_item",
data: req.request,
},
req?.relays
);
let results: RPCClearCacheResponse = {
relays: {},
data: true,
signedField: "relays",
};
for (const relay in resp) {
let ret: RPCClearCacheResponse;
try {
ret = await resp.get(relay);
} catch (e: any) {
(results.relays as RPCClearCacheResponseRelayList)[relay] = {
error: e.message,
};
}
}
return results;
}
try {
api.getRpcServer().cache.deleteItem(req.request);
} catch (e: any) {
throw e;
}
return {
data: true,
};
},
});
api.registerMethod("broadcast_request", {
cacheable: false,
async handler(req: RPCBroadcastRequest): Promise<RPCBroadcastResponse> {
if (!req?.request) {
throw new Error("request required");
}
if (!req?.relays?.length) {
throw new Error("relays required");
}
let resp = await broadcastRequest(req.request, req.relays);
const result: RPCBroadcastResponse = {
relays: {},
data: true,
signedField: "relays",
};
for (const relay in resp) {
let ret: RPCClearCacheResponse;
try {
ret = await resp.get(relay);
} catch (e: any) {
result.relays[relay] = { error: e.message };
}
}
return result;
},
});
},
};
export default plugin;

View File

@@ -4,69 +4,30 @@ import DHT from "@hyperswarm/dht";
import { relay } from "@hyperswarm/dht-relay";
// @ts-ignore
import Stream from "@hyperswarm/dht-relay/ws";
import express, { Express } from "express";
import config from "../config.js";
import * as http from "http";
import * as https from "https";
import { get as getSwarm } from "./swarm.js";
import WS from "ws";
// @ts-ignore
import log from "loglevel";
import log from "../log.js";
import { AddressInfo } from "net";
// @ts-ignore
import promiseRetry from "promise-retry";
import { getSslContext } from "./ssl.js";
import fastify from "fastify";
import * as http2 from "http2";
import websocket from "@fastify/websocket";
export async function start() {
const relayPort = config.uint("port");
const dht = getSwarm();
const dht = await getSwarm();
const statusCodeServer = http.createServer(function (req, res) {
// @ts-ignore
res.writeHead(req.headers["x-status"] ?? 200, {
"Content-Type": "text/plain",
});
res.end();
let relayServer = fastify({
http2: true,
logger: log.child({ module: "relay-server" }),
});
await new Promise((resolve) => {
statusCodeServer.listen(25252, "0.0.0.0", function () {
const address = statusCodeServer.address() as AddressInfo;
log.info(
"Status Code Server started on ",
`${address.address}:${address.port}`
);
resolve(null);
});
relayServer.register(websocket);
relayServer.get("/", { websocket: true }, (connection) => {
relay(dht, new Stream(false, connection.socket));
});
let relayServer: https.Server | http.Server;
if (config.bool("ssl")) {
relayServer = https.createServer({
SNICallback(servername, cb) {
cb(null, getSslContext());
},
});
} else {
relayServer = http.createServer();
}
let wsServer = new WS.Server({ server: relayServer });
wsServer.on("connection", (socket: any) => {
relay(dht, new Stream(false, socket));
});
await new Promise((resolve) => {
relayServer.listen(relayPort, "0.0.0.0", function () {
const address = relayServer.address() as AddressInfo;
log.info(
"DHT Relay Server started on ",
`${address.address}:${address.port}`
);
resolve(null);
});
});
await relayServer.listen({ port: config.uint("core.port"), host: "0.0.0.0" });
}
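A client would drive the relayed DHT over that WebSocket endpoint roughly as follows; this is a sketch based on the documented @hyperswarm/dht-relay usage, and the hostname and port are placeholders for whatever core.port is configured to:

import DHT from "@hyperswarm/dht-relay";
// @ts-ignore
import Stream from "@hyperswarm/dht-relay/ws";
import WS from "ws";

const socket = new WS("wss://relay.example.com:8080/");
const dht = new DHT(new Stream(true, socket)); // true marks the initiating (client) side
await dht.ready();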

View File

@@ -5,35 +5,50 @@ import config from "../config.js";
import { errorExit } from "../lib/error.js";
// @ts-ignore
import stringify from "json-stable-stringify";
import { getRpcServer, RPC_PROTOCOL_SYMBOL } from "./rpc/server.js";
import {
getRpcServer,
RPC_PROTOCOL_ID,
RPC_PROTOCOL_SYMBOL,
setupStream,
} from "./rpc/server.js";
import { get as getSwarm, SecretStream } from "./swarm.js";
import b4a from "b4a";
// @ts-ignore
import Protomux from "protomux";
export async function start() {
if (!config.str("pocket-app-id") || !config.str("pocket-app-key")) {
errorExit("Please set pocket-app-id and pocket-app-key config options.");
}
(await getSwarm()).on("connection", (stream: SecretStream) =>
getRpcServer().setup(stream)
);
getSwarm().on("connection", (stream: SecretStream) => {
Protomux.from(stream).pair(
{ protocol: "protomux-rpc", id: RPC_PROTOCOL_ID },
async () => {
getRpcServer().setup(stream);
}
);
});
}
export async function getRpcByPeer(peer: string) {
const swarm = await getSwarm();
export async function getRpcByPeer(peer: Buffer | string) {
const swarm = getSwarm();
if (!b4a.isBuffer(peer)) {
peer = b4a.from(peer, "hex") as Buffer;
}
if (swarm._allConnections.has(peer)) {
return swarm._allConnections.get(peer)[RPC_PROTOCOL_SYMBOL];
}
return new Promise((resolve) => {
const listener = () => {};
swarm.on("connection", (peer: any, info: any) => {
if (info.publicKey.toString("hex") !== peer) {
const listener = (peer: any, info: any) => {
if (info.publicKey.toString("hex") !== peer.toString("hex")) {
return;
}
swarm.removeListener("connection", listener);
resolve(peer[RPC_PROTOCOL_SYMBOL]);
});
resolve(setupStream(peer));
};
swarm.on("connection", listener);
swarm.joinPeer(peer);
});
}

View File

@@ -1,135 +0,0 @@
import EventEmitter from "events";
import DHTCache from "@lumeweb/dht-cache";
import {
RPCCacheData,
RPCCacheItem,
RPCRequest,
RPCResponse,
} from "@lumeweb/relay-types";
import { getRpcByPeer } from "../rpc";
import b4a from "b4a";
import { get as getSwarm } from "../swarm";
import { RPCServer } from "./server";
// @ts-ignore
import orderedJSON from "ordered-json";
// @ts-ignore
import crypto from "hypercore-crypto";
export class RPCCache extends EventEmitter {
private dhtCache?: DHTCache;
private server: RPCServer;
private _swarm?: any;
get swarm(): any {
return this._swarm;
}
private _data: RPCCacheData = {};
get data(): RPCCacheData {
return this._data;
}
constructor(server: RPCServer) {
super();
this.server = server;
this.init();
}
public async getNodeQuery(
node: string,
queryHash: string
): Promise<boolean | RPCResponse> {
if (!this.dhtCache?.peerHasItem(node, queryHash)) {
return false;
}
const rpc = await getRpcByPeer(node);
let response;
try {
response = rpc.request("rpc.get_cached_item", queryHash) as RPCCacheItem;
} catch (e: any) {
return false;
}
if (!this.verifyResponse(b4a.from(node, "hex") as Buffer, response)) {
return false;
}
return { ...response?.value };
}
public signResponse(item: RPCCacheItem): string {
const field = item.value.signedField || "data";
const updated = item.value.updated;
// @ts-ignore
const data = item.value[field];
const json = orderedJSON.stringify(data);
return this.server.signData(`${updated}${json}`);
}
public verifyResponse(pubkey: Buffer, item: RPCCacheItem): boolean | Buffer {
const field = item.value.signedField || "data";
const updated = item.value.updated;
// @ts-ignore
const data = item.value[field];
const json = orderedJSON.stringify(data);
try {
if (
!crypto.verify(
Buffer.from(`${updated}${json}`),
Buffer.from(item?.signature as string, "hex"),
pubkey
)
) {
return false;
}
} catch {
return false;
}
return true;
}
public addItem(query: RPCRequest, response: RPCResponse) {
const queryHash = RPCServer.hashQuery(query);
const clonedResponse = { ...response };
clonedResponse.updated = Date.now();
const item = {
value: clonedResponse,
signature: "",
};
item.signature = this.signResponse(item);
this._data[queryHash] = item;
}
public deleteItem(queryHash: string): boolean {
const cache = this.dhtCache?.cache;
if (!cache?.includes(queryHash)) {
throw Error("item does not exist");
}
this.dhtCache?.removeItem(queryHash);
delete this._data[queryHash];
return true;
}
private async init() {
this.dhtCache = new DHTCache(await getSwarm(), {
protocol: "lumeweb.rpccache",
});
this._swarm = await getSwarm();
}
}

View File

@@ -1,10 +1,9 @@
import {
RPCCacheData,
RPCCacheItem,
RPCMethod,
RPCRequest,
RPCResponse,
} from "@lumeweb/relay-types";
} from "@lumeweb/interface-relay";
import EventEmitter from "events";
// @ts-ignore
import ProtomuxRPC from "protomux-rpc";
@@ -12,18 +11,17 @@ import b4a from "b4a";
import { get as getSwarm, SecretStream } from "../swarm";
// @ts-ignore
import c from "compact-encoding";
import DHTCache from "@lumeweb/dht-cache";
// @ts-ignore
import crypto from "hypercore-crypto";
// @ts-ignore
import orderedJSON from "ordered-json";
import { Mutex } from "async-mutex";
import { RPCCache } from "./cache";
// @ts-ignore
import jsonStringify from "json-stringify-deterministic";
const sodium = require("sodium-universal");
let server: RPCServer;
const RPC_PROTOCOL_ID = b4a.from("lumeweb");
export const RPC_PROTOCOL_ID = b4a.from("lumeweb");
export const RPC_PROTOCOL_SYMBOL = Symbol.for(RPC_PROTOCOL_ID.toString());
export function getRpcServer(): RPCServer {
@@ -34,6 +32,20 @@ export function getRpcServer(): RPCServer {
return server as RPCServer;
}
export function setupStream(stream: SecretStream) {
const existing = stream[RPC_PROTOCOL_SYMBOL];
if (existing) {
return existing;
}
stream[RPC_PROTOCOL_SYMBOL] = new ProtomuxRPC(stream, {
id: RPC_PROTOCOL_ID,
valueEncoding: c.json,
});
return stream[RPC_PROTOCOL_SYMBOL];
}
export class RPCServer extends EventEmitter {
private _modules: Map<string, Map<string, RPCMethod>> = new Map<
string,
@@ -41,12 +53,6 @@ export class RPCServer extends EventEmitter {
>();
private pendingRequests: Map<string, Mutex> = new Map<string, Mutex>();
private _cache: RPCCache = new RPCCache(this);
get cache(): RPCCache {
return this._cache;
}
public static hashQuery(query: RPCRequest): string {
const clonedQuery: RPCRequest = {
module: query.module,
@@ -56,7 +62,7 @@ export class RPCServer extends EventEmitter {
const queryHash = Buffer.allocUnsafe(32);
sodium.crypto_generichash(
queryHash,
Buffer.from(orderedJSON.stringify(clonedQuery))
Buffer.from(jsonStringify(clonedQuery))
);
return queryHash.toString("hex");
}
@@ -102,16 +108,7 @@ export class RPCServer extends EventEmitter {
}
public setup(stream: SecretStream) {
const existing = stream[RPC_PROTOCOL_SYMBOL];
if (existing) return existing;
const options = {
id: RPC_PROTOCOL_ID,
valueEncoding: c.json,
};
const rpc = new ProtomuxRPC(stream, options);
stream[RPC_PROTOCOL_SYMBOL] = rpc;
const rpc = setupStream(stream);
for (const module of this._modules.keys()) {
for (const method of (
@@ -129,44 +126,52 @@
public signData(data: any): string {
let raw = data;
if (typeof data !== "string") {
raw = orderedJSON.stringify(data);
raw = jsonStringify(data);
}
return crypto
.sign(Buffer.from(raw, this._cache.swarm.keyPair.privateKey))
.sign(Buffer.from(raw), getSwarm().keyPair.secretKey)
.toString("hex");
}
private async handleRequest(request: RPCRequest) {
public async handleRequest(request: RPCRequest) {
let lockedRequest = await this.waitOnRequestLock(request);
if (lockedRequest) {
return lockedRequest;
}
let cachedRequest = this.getCachedRequest(request) as RPCCacheItem;
if (cachedRequest) {
return cachedRequest.value;
}
let method = this.getMethodByRequest(request) as RPCMethod;
let method = this.getMethodByRequest(request);
let ret;
let error;
try {
ret = (await method.handler(request.data)) as RPCResponse | any;
} catch (e) {
error = e;
if (method instanceof Error) {
error = method;
}
if (!error) {
method = method as RPCMethod;
try {
ret = (await method.handler(request.data)) as RPCResponse | any;
} catch (e) {
error = e;
}
}
if (error) {
this.getRequestLock(request)?.release();
throw error;
}
let rpcResult: RPCResponse = {};
if (ret === undefined) {
ret = {
data: true,
};
}
if (ret?.data) {
rpcResult = { ...ret };
@@ -181,22 +186,11 @@
};
}
if (method.cacheable) {
this.cache.addItem(request, rpcResult);
}
this.getRequestLock(request)?.release();
return rpcResult;
}
private getCachedRequest(request: RPCRequest): RPCCacheItem | boolean {
const req = RPCServer.hashQuery(request);
if (RPCServer.hashQuery(request) in this._cache.data) {
this._cache.data[req];
}
return false;
}
private getMethodByRequest(request: RPCRequest): Error | RPCMethod {
return this.getMethod(request.module, request.method);
}
@@ -225,23 +219,35 @@
return;
}
const reqId = RPCServer.hashQuery(request);
let lock: Mutex = this.pendingRequests.get(reqId) as Mutex;
const lockExists = !!lock;
if (!lockExists) {
lock = new Mutex();
this.pendingRequests.set(reqId, lock);
if (!this.getRequestLock(request)) {
this.createRequestLock(request);
}
const reqId = RPCServer.hashQuery(request);
const lock: Mutex = this.getRequestLock(request) as Mutex;
if (lock.isLocked()) {
await lock.waitForUnlock();
if (reqId in this._cache.data) {
return this._cache.data[reqId] as RPCCacheItem;
}
}
await lock.acquire();
}
private getRequestLock(request: RPCRequest): Mutex | null {
const reqId = RPCServer.hashQuery(request);
let lock: Mutex = this.pendingRequests.get(reqId) as Mutex;
if (!lock) {
return null;
}
return lock;
}
private createRequestLock(request: RPCRequest) {
const reqId = RPCServer.hashQuery(request);
this.pendingRequests.set(reqId, new Mutex());
}
}

View File

@@ -1,151 +0,0 @@
import tls from "tls";
import {
createIndependentFileSmall,
openIndependentFileSmall,
overwriteIndependentFileSmall,
} from "../lib/file.js";
// @ts-ignore
import promiseRetry from "promise-retry";
import config from "../config.js";
import log from "loglevel";
import { getSeed } from "../lib/util.js";
import type {
IndependentFileSmall,
SavedSslData,
SslData,
} from "@lumeweb/relay-types";
let sslCtx: tls.SecureContext = tls.createSecureContext();
let sslObject: SslData = {};
let sslChecker: () => Promise<void>;
const FILE_CERT_NAME = "/lumeweb/relay/ssl.crt";
const FILE_KEY_NAME = "/lumeweb/relay/ssl.key";
export function setSslContext(context: tls.SecureContext) {
sslCtx = context;
}
export function getSslContext(): tls.SecureContext {
return sslCtx;
}
export function setSsl(
cert: IndependentFileSmall | Uint8Array,
key: IndependentFileSmall | Uint8Array
): void {
cert = (cert as IndependentFileSmall)?.fileData || cert;
key = (key as IndependentFileSmall)?.fileData || key;
sslObject.cert = cert as Uint8Array;
sslObject.key = key as Uint8Array;
setSslContext(
tls.createSecureContext({
cert: Buffer.from(cert),
key: Buffer.from(key),
})
);
}
export function getSsl(): SslData {
return sslObject;
}
export async function saveSSl(): Promise<void> {
const seed = getSeed();
log.info(`Saving SSL Certificate for ${config.str("domain")}`);
let oldCert = await getSslCert();
let cert: any = getSsl()?.cert;
if (oldCert) {
await overwriteIndependentFileSmall(
oldCert as IndependentFileSmall,
Buffer.from(cert)
);
} else {
await createIndependentFileSmall(seed, FILE_CERT_NAME, Buffer.from(cert));
}
let oldKey = await getSslKey();
let key: any = getSsl()?.key;
if (oldKey) {
await overwriteIndependentFileSmall(
oldKey as IndependentFileSmall,
Buffer.from(key)
);
} else {
await createIndependentFileSmall(seed, FILE_KEY_NAME, Buffer.from(key));
}
log.info(`Saved SSL Certificate for ${config.str("domain")}`);
}
export async function getSavedSsl(
retry = true
): Promise<boolean | SavedSslData> {
let retryOptions = retry ? {} : { retries: 0 };
let sslCert: IndependentFileSmall | boolean = false;
let sslKey: IndependentFileSmall | boolean = false;
try {
await promiseRetry(async (retry: any) => {
sslCert = await getSslCert();
if (!sslCert) {
retry();
}
}, retryOptions);
await promiseRetry(async (retry: any) => {
sslKey = await getSslKey();
if (!sslKey) {
retry();
}
}, retryOptions);
} catch {}
if (!sslCert || !sslKey) {
return false;
}
return {
cert: sslCert as IndependentFileSmall,
key: sslKey as IndependentFileSmall,
};
}
async function getSslCert(): Promise<IndependentFileSmall | boolean> {
return getSslFile(FILE_CERT_NAME);
}
async function getSslKey(): Promise<IndependentFileSmall | boolean> {
return getSslFile(FILE_KEY_NAME);
}
async function getSslFile(
name: string
): Promise<IndependentFileSmall | boolean> {
let seed = getSeed();
let [file, err] = await openIndependentFileSmall(seed, name);
if (err) {
return false;
}
return file;
}
export function setSSlCheck(checker: () => Promise<void>): void {
sslChecker = checker;
}
export function getSslCheck(): () => Promise<void> {
return sslChecker;
}
export async function start() {
if (config.bool("ssl") && getSslCheck()) {
await getSslCheck()();
}
}

View File

@@ -5,54 +5,102 @@
import Hyperswarm from "hyperswarm";
// @ts-ignore
import DHT from "@hyperswarm/dht";
import config from "../config.js";
import { errorExit } from "../lib/error.js";
import {
deriveMyskyRootKeypair,
seedPhraseToSeed,
validSeedPhrase,
} from "libskynet";
// @ts-ignore
import Protomux from "protomux";
// @ts-ignore
import sodium from "sodium-universal";
import b4a from "b4a";
import log from "../log.js";
import { getKeyPair } from "../lib/seed.js";
import { getPluginAPI } from "./plugin";
const LUMEWEB = b4a.from("lumeweb");
export const LUMEWEB_TOPIC_HASH = b4a.allocUnsafe(32);
sodium.crypto_generichash(LUMEWEB_TOPIC_HASH, LUMEWEB);
export type SecretStream = any;
let node: Hyperswarm;
let protocolManager: ProtocolManager;
export function getKeyPair() {
const seed = config.str("seed");
let err = validSeedPhrase(seed);
if (err !== null) {
errorExit("LUME_WEB_RELAY_SEED is invalid. Aborting.");
}
return deriveMyskyRootKeypair(seedPhraseToSeed(seed)[0]);
}
async function start() {
export async function start() {
const keyPair = getKeyPair();
const bootstrap = DHT.bootstrapper(49737, "0.0.0.0");
await bootstrap.ready();
node = new Hyperswarm({ keyPair, dht: new DHT({ keyPair }) });
const topic = b4a.allocUnsafe(32);
sodium.crypto_generichash(topic, LUMEWEB);
const address = bootstrap.address();
node = new Hyperswarm({
keyPair,
dht: new DHT({
keyPair,
bootstrap: [{ host: address.host, port: address.port }].concat(
require("@hyperswarm/dht/lib/constants").BOOTSTRAP_NODES
),
}),
});
// @ts-ignore
await node.dht.ready();
await node.listen();
node.join(topic);
node.join(LUMEWEB_TOPIC_HASH);
getPluginAPI().on("core.shutdown", async () => {
return bootstrap.destroy();
});
log.info(
"Relay Identity is %s",
b4a.from(getKeyPair().publicKey).toString("hex")
);
return node;
}
export async function get(): Promise<Hyperswarm> {
if (!node) {
await start();
export function get(): Hyperswarm {
return node;
}
export class ProtocolManager {
private _protocols: Map<string, Function> = new Map<string, Function>();
private _swarm;
constructor(swarm: any) {
this._swarm = swarm;
this._swarm.on("connection", (peer: any) => {
if (!peer.userData) {
peer.userData = null;
}
for (const protocol of this._protocols) {
Protomux.from(peer).pair(
{ protocol: protocol[0] },
this.handler.bind(this, protocol[0], peer)
);
}
});
}
return node;
private handler(protocol: string, peer: any) {
if (this._protocols.has(protocol)) {
this._protocols.get(protocol)?.(peer, Protomux.from(peer));
}
}
public register(name: string, handler: Function): boolean {
if (this._protocols.has(name)) {
return false;
}
this._protocols.set(name, handler);
return true;
}
}
export function getProtocolManager(): ProtocolManager {
if (!protocolManager) {
protocolManager = new ProtocolManager(get());
}
return protocolManager;
}
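A plugin can attach a custom wire protocol through this manager via api.protocols; a small sketch in which the protocol name and channel handling are invented, assuming the Protomux channel API:

// Inside a plugin's plugin(api) callback
api.protocols.register("example/1", (peer: any, mux: any) => {
  // Invoked once a connected peer pairs on "example/1"; mux is that peer's Protomux instance.
  const channel = mux.createChannel({ protocol: "example/1" });
  channel.open();
});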

View File

@@ -1,9 +1,15 @@
import { Plugin, PluginAPI } from "@lumeweb/relay-types";
import { getRpcServer } from "../rpc/server";
import { Plugin, PluginAPI } from "@lumeweb/interface-relay";
import defer from "p-defer";
const plugin: Plugin = {
name: "core",
async plugin(api: PluginAPI): Promise<void> {
const pluginsLoaded = defer();
api.once("core.pluginsLoaded", () => {
pluginsLoaded.resolve();
});
api.registerMethod("ping", {
cacheable: false,
async handler(): Promise<any> {
@@ -14,7 +20,9 @@ const plugin: Plugin = {
api.registerMethod("get_methods", {
cacheable: false,
async handler(): Promise<any> {
return api.getRpcServer().getMethods();
await pluginsLoaded.promise;
return api.rpcServer.getMethods();
},
});
},

src/plugins/dht.ts (new file, 33 lines)
View File

@@ -0,0 +1,33 @@
import { Plugin, PluginAPI } from "@lumeweb/interface-relay";
import b4a from "b4a";
const plugin: Plugin = {
name: "dht",
async plugin(api: PluginAPI): Promise<void> {
api.registerMethod("join_topic", {
cacheable: false,
async handler(topic: string): Promise<void> {
if (!api.swarm._discovery.has(topic)) {
api.swarm.join(topic);
}
},
});
api.registerMethod("get_topic_peers", {
cacheable: false,
async handler(topic: string): Promise<string[]> {
return [...api.swarm.peers.values()]
.filter((peerInfo) => peerInfo._seenTopics.has(topic))
.map((peerInfo) => b4a.from(peerInfo.publicKey).toString());
},
});
api.registerMethod("get_topics", {
cacheable: false,
async handler(): Promise<string[]> {
return [...api.swarm.peers.keys()];
},
});
},
};
export default plugin;
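Because methods are registered under the plugin name as the module, a remote peer reaches these handlers as dht.* over protomux-rpc; roughly, using a handle obtained as in getRpcByPeer above (the relay key is a placeholder):

const rpc = await getRpcByPeer("<relay public key, hex>");
await rpc.request("dht.join_topic", "my-topic");
const peers: string[] = await rpc.request("dht.get_topic_peers", "my-topic");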

src/plugins/index.ts (new file, 7 lines)
View File

@@ -0,0 +1,7 @@
import core from "./core";
import rpc from "./rpc";
import dht from "./dht";
const corePlugins = [core, dht, rpc];
export default corePlugins;

src/plugins/rpc.ts (new file, 102 lines)
View File

@@ -0,0 +1,102 @@
import {
Plugin,
PluginAPI,
RPCBroadcastRequest,
RPCBroadcastResponse,
RPCRequest,
RPCResponse,
} from "@lumeweb/interface-relay";
import { getRpcByPeer } from "../modules/rpc";
import { get as getSwarm } from "../modules/swarm";
import b4a from "b4a";
import pTimeout, { ClearablePromise } from "p-timeout";
let api: PluginAPI;
async function broadcastRequest(
request: RPCRequest,
relays: string[],
timeout = 5000
): Promise<Map<string, Promise<any>>> {
const makeRequest = async (relay: string) => {
const rpc = await getRpcByPeer(relay);
return rpc.request(`${request.module}.${request.method}`, request.data);
};
let relayMap = new Map<string, ClearablePromise<any>>();
for (const relay of relays) {
let req;
if (b4a.equals(b4a.from(relay, "hex"), getSwarm().keyPair.publicKey)) {
req = api.rpcServer.handleRequest(request);
} else {
req = makeRequest(relay);
}
let timeoutPromise = pTimeout(req, {
milliseconds: timeout,
message: `relay timed out after ${timeout} milliseconds`,
});
relayMap.set(relay, timeoutPromise);
}
await Promise.allSettled([...relayMap.values()]);
return relayMap;
}
const plugin: Plugin = {
name: "rpc",
async plugin(_api: PluginAPI): Promise<void> {
api = _api;
api.registerMethod("broadcast_request", {
cacheable: false,
async handler(req: RPCBroadcastRequest): Promise<RPCBroadcastResponse> {
if (!req?.request) {
throw new Error("request required");
}
if (!req?.request?.module) {
throw new Error("request.module required");
}
if (!req?.request?.method) {
throw new Error("request.method required");
}
if (!req?.relays?.length) {
throw new Error("relays required");
}
if (
req?.request?.module === "rpc" &&
req?.request?.method === "broadcast_request"
) {
throw new Error("recursive broadcast_request calls are not allowed");
}
let resp = await broadcastRequest(req.request, req.relays, req.timeout);
const result: RPCBroadcastResponse = {
relays: {},
data: true,
signedField: "relays",
};
for (const relay of resp.keys()) {
let ret: RPCResponse | Error;
try {
ret = await resp.get(relay);
if (ret instanceof Error) {
result.relays[relay] = { error: ret.message };
} else {
result.relays[relay] = ret as RPCResponse;
}
} catch (e: any) {
result.relays[relay] = { error: e.message };
}
}
return result;
},
});
},
};
export default plugin;
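A call against this method would be shaped roughly as below; the relay keys and inner request are placeholders, and the response maps each relay key to either its RPCResponse or an { error } entry:

const response = await rpc.request("rpc.broadcast_request", {
  request: { module: "dht", method: "get_topics", data: null },
  relays: ["<relay 1 public key, hex>", "<relay 2 public key, hex>"],
  timeout: 5000,
});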

yarn.lock (3940 lines changed; file diff suppressed because it is too large)