* feat(providers): tokio 1.0
BREAKING: This removes async-std as a compatibility option
* feat: tokio 1.0 in rest of crates
* fix: patch Cargo.toml until deps are released
* fix(contract): load ws deps
* feat: bytes 1.0 (#121)
* feat(core): move to bytes::Bytes
* feat: adjust rest of crates to Bytes
* chore: bump deps
CI fails due to:
https://github.com/snapview/tokio-tungstenite/pull/142#discussion_r550445144
* chore: use latest tokio-tungstenite
* ci: split tests into jobs (#129)
* Switch to `hex` (#128)
* fix(core): replace rustc_hex with hex
* fix(providers): replace rustc_hex with hex
* chore: replace rustc-hex with hex
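The swap from rustc-hex to the `hex` crate above is mechanical; a minimal sketch of the replacement calls (the byte values are illustrative only):

```rust
fn main() {
    // Round-trip with the `hex` crate that replaces rustc-hex in these commits.
    // The byte values are illustrative only.
    let bytes = vec![0xde, 0xad, 0xbe, 0xef];
    let encoded = hex::encode(&bytes); // "deadbeef"
    assert_eq!(hex::decode(&encoded).unwrap(), bytes);
}
```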
* chore: cargo fmt
* fix(ledger): copy address from string correctly
* chore: fix flaky tests
Fixes #105
* feat: allow encoding/decoding function data
* feat: allow decoding event data
* feat: human readable abi
Inspired by https://blog.ricmoo.com/human-readable-contract-abis-in-ethers-js-141902f4d917
* test: add event / fn decoding tests
* chore: fix clippy
* feat(abigen): allow providing args in human readable format
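Together, the encode/decode and human-readable ABI commits above let a contract interface be declared inline and used to produce calldata. A minimal sketch against the API added here (the function signatures and values are illustrative; exact method names and error types may differ slightly):

```rust
use ethers::{abi::parse_abi, contract::BaseContract};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build a contract interface from a human-readable ABI
    // (these signatures are illustrative, not from a real contract).
    let abi = BaseContract::from(parse_abi(&[
        "function setValue(string)",
        "function getValue() external view returns (string)",
    ])?);

    // Encode calldata for setValue("hello"); the matching decode/decode_event
    // helpers recover typed values from calldata and event logs.
    let calldata = abi.encode("setValue", "hello".to_string())?;
    println!("0x{}", hex::encode(&calldata));
    Ok(())
}
```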
* Implement Multicall functionality for batched calls
* Add documentation and apply modifications suggested in the review
* (Abigen) handle single input arg and set output irrespective of mutability
* implement send functionality and allow clearing calls
* Fix detokenization; pre-processing is no longer required
* panic when more calls than supported are pushed
* add docs for the panics in add_call
* (multicall) eth_balance support, update bindings
* refactor: move multicall to its own directory
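The Multicall commits above batch several contract calls into one request, panic once the supported number of calls is exceeded, and allow reusing an instance via clear_calls. A rough usage sketch, assuming a Multicall deployment on the target chain; the contract signatures are illustrative and exact method signatures may differ from the final API:

```rust
use std::{convert::TryFrom, sync::Arc};

use ethers::{
    abi::parse_abi,
    contract::{Contract, Multicall},
    providers::{Http, Provider},
    types::Address,
};

async fn batched_reads(addr: Address) -> Result<(), Box<dyn std::error::Error>> {
    // Any Middleware works; an HTTP provider keeps the sketch short.
    let client = Arc::new(Provider::<Http>::try_from("http://localhost:8545")?);

    // A contract with a couple of read-only methods (illustrative signatures).
    let abi = parse_abi(&[
        "function getValue() external view returns (string)",
        "function lastSender() external view returns (address)",
    ])?;
    let contract = Contract::new(addr, abi, client.clone());

    // `None` lets Multicall use the known deployment for the detected chain.
    let mut multicall = Multicall::new(client, None).await?;

    // Queue calls; add_call panics once the supported number of calls is exceeded.
    multicall
        .add_call(contract.method::<_, String>("getValue", ())?)
        .add_call(contract.method::<_, Address>("lastSender", ())?);

    // A single aggregate call fetches both results and detokenizes into a tuple.
    let (_value, _sender): (String, Address) = multicall.call().await?;

    // clear_calls() lets the same instance be reused for another batch,
    // and send() submits state-changing batches as a transaction.
    multicall.clear_calls();
    Ok(())
}
```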
* fix: add infura api key
* ci: ensure CI runs on PRs from forks
* test(multicall): re-use aggregate call
* contract: make multicall docs compile and remove redundant clones
* ci: add public etherscan API key so that forks don't get rate limited
* chore: adjust test contract naming
Co-authored-by: Georgios Konstantopoulos <me@gakonst.com>
* test(tokens): ensure nested tuples tokenize properly
* test(contract): add failing test with 2 args
This happens because the args are serialized as a single Token::Tuple, when they should instead be a flat vector of tokens
* fix(tokens): add token flattening method to fix non-nested tuples
* fix: do not export the flatten function
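The two-argument failure and the flattening fix above come down to how call arguments tokenize: a plain argument list must not be wrapped in a single Token::Tuple. A small sketch of the user-facing effect, with an illustrative function signature; the behaviour of nested (struct-like) tuples is covered by the tests referenced above:

```rust
use ethers::{
    abi::parse_abi,
    contract::BaseContract,
    types::{Address, U256},
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // A function taking two plain arguments (illustrative signature).
    let abi = BaseContract::from(parse_abi(&["function setValues(uint256, address)"])?);

    // Before the flattening fix the (value, who) pair was serialized as one
    // Token::Tuple and failed to match the two-parameter function; it now
    // encodes as two separate arguments.
    let calldata = abi.encode("setValues", (U256::from(42u64), Address::zero()))?;
    assert!(!calldata.as_ref().is_empty());
    Ok(())
}
```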