getLogs error "query returned more than 1000 results"

I'm using getLogs in web3.py to avoid eth.filters so that I can use Infura as my provider. https://web3py.readthedocs.io/en/stable/web3.eth.html?highlight=getlogs#web3.eth.Eth.getLogs

Last week I was able to use this function with no issue, but today I started consistently getting this error message: {'code': -32005, 'message': 'query returned more than 1000 results'}

When I use my local node, getLogs usually returns fewer than 10 items, nowhere close to 1000 results.
Did anything change recently that may be causing this issue?

Example of a getLogs response ("Transfer" events from WETH) when I run my own node:

[
  {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',
    'blockHash': HexBytes('0xe09e45baae928f281035d170f8b91e84809059fde26aee80cea72d3b485d72d2'),
    'blockNumber': 7271526,
    'data': '0x000000000000000000000000000000000000000000000000001115fcc7b8d900',
    'logIndex': 61,
    'removed': False,
    'topics': [
      HexBytes('0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'),
      HexBytes('0x00000000000000000000000041a7f96431595e4310b0b5719beba5f729b38af3'),
      HexBytes('0x000000000000000000000000c95af34ba523562dc0510f3a7034c2bb37328989')
    ],
    'transactionHash': HexBytes('0x46fb9a219911c6c63f3259461ee6b46e2cc37d41e8535780d3f115fc0765d780'),
    'transactionIndex': 76,
    'transactionLogIndex': '0x1',
    'type': 'mined'
  },
  {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',
    'blockHash': HexBytes('0xe09e45baae928f281035d170f8b91e84809059fde26aee80cea72d3b485d72d2'),
    'blockNumber': 7271526,
    'data': '0x000000000000000000000000000000000000000000000000000d98832d377c00',
    'logIndex': 63,
    'removed': False,
    'topics': [
      HexBytes('0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'),
      HexBytes('0x00000000000000000000000041a7f96431595e4310b0b5719beba5f729b38af3'),
      HexBytes('0x00000000000000000000000049497a4d914ae91d34ce80030fe620687bf333fd')
    ],
    'transactionHash': HexBytes('0x46fb9a219911c6c63f3259461ee6b46e2cc37d41e8535780d3f115fc0765d780'),
    'transactionIndex': 76,
    'transactionLogIndex': '0x3',
    'type': 'mined'
  },
  {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',
    'blockHash': HexBytes('0xe09e45baae928f281035d170f8b91e84809059fde26aee80cea72d3b485d72d2'),
    'blockNumber': 7271526,
    'data': '0x000000000000000000000000000000000000000000000000744bbd99d1c60000',
    'logIndex': 76,
    'removed': False,
    'topics': [
      HexBytes('0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'),
      HexBytes('0x00000000000000000000000039755357759ce0d7f32dc8dc45414cca409ae24e'),
      HexBytes('0x000000000000000000000000c3d73a4f23373d9334fe007828026af363936f18')
    ],
    'transactionHash': HexBytes('0xaaa1129520294dc64061342ed3818f0e11c2760da8d99c7b784cf6181a91141e'),
    'transactionIndex': 86,
    'transactionLogIndex': '0x0',
    'type': 'mined'
  }
]

Hey thanks for reaching out, can you provide a code snippet for how you’re hitting this and the full error response?

import asyncio
import functools

import cytoolz

from web3.utils.contracts import find_matching_event_abi
from web3.utils.events import get_event_data
from web3.utils.filters import construct_event_filter_params

async def poll_erc20_logs_loop(self):
    ev_loop: asyncio.BaseEventLoop = self._ev_loop
    while True:
        try:
            transfer_tasks = []
            approval_tasks = []
            for address in self._contract_addresses:
                _, transfer_event_filter_params = construct_event_filter_params(
                    find_matching_event_abi(abi, event_name=TRANSFER_EVENT_NAME),
                    address=address
                )
                transfer_tasks.append(ev_loop.run_in_executor(
                    wings.get_executor(),
                    functools.partial(self._w3.eth.getLogs, transfer_event_filter_params)
                ))
                _, approval_event_filter_params = construct_event_filter_params(
                    find_matching_event_abi(abi, event_name=APPROVAL_EVENT_NAME),
                    address=address
                )
                approval_tasks.append(ev_loop.run_in_executor(
                    wings.get_executor(),
                    functools.partial(self._w3.eth.getLogs, approval_event_filter_params)
                ))

            raw_transfer_logs = await asyncio.gather(*transfer_tasks)
            raw_approval_logs = await asyncio.gather(*approval_tasks)
            transfer_logs = list(cytoolz.concat(raw_transfer_logs))
            approval_logs = list(cytoolz.concat(raw_approval_logs))

            print('transfer_logs', transfer_logs)

This is a sample response I get when using my Alchemy node (https://alchemyapi.io/):

[
  {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',
    'blockHash': HexBytes('0x7b13c85b02bd4f9ed04bfbca9d3e8861ae40c0e70d8ac4ec652a19652ac7a5fb'),
    'blockNumber': 7271571,
    'data': '0x0000000000000000000000000000000000000000000000008ac7230489e80000',
    'logIndex': 86,
    'removed': False,
    'topics': [
      HexBytes('0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'),
      HexBytes('0x000000000000000000000000c6093fd9cc143f9f058938868b2df2daf9a91d28'),
      HexBytes('0x00000000000000000000000039755357759ce0d7f32dc8dc45414cca409ae24e')
    ],
    'transactionHash': HexBytes('0x99f59eef449c4ef72dd2dc049ba6ccb8df88cda4d56259cb86e5e8ad579de8bd'),
    'transactionIndex': 46,
    'transactionLogIndex': '0x0',
    'type': 'mined'
  },
  {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',
    'blockHash': HexBytes('0x7b13c85b02bd4f9ed04bfbca9d3e8861ae40c0e70d8ac4ec652a19652ac7a5fb'),
    'blockNumber': 7271571,
    'data': '0x0000000000000000000000000000000000000000000000000107003c93b09eb0',
    'logIndex': 111,
    'removed': False,
    'topics': [
      HexBytes('0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'),
      HexBytes('0x000000000000000000000000c3be16c9755aadbadf4065a3e684077217425d69'),
      HexBytes('0x000000000000000000000000cea8390009de7abb231db66e1a19b8de5e3f04cc')
    ],
    'transactionHash': HexBytes('0xdb6145aa39a7a251a6e1ab70d710e7316f4598a71beca756055167a20a993fa5'),
    'transactionIndex': 65,
    'transactionLogIndex': '0x1',
    'type': 'mined'
  },
  {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',
    'blockHash': HexBytes('0x7b13c85b02bd4f9ed04bfbca9d3e8861ae40c0e70d8ac4ec652a19652ac7a5fb'),
    'blockNumber': 7271571,
    'data': '0x000000000000000000000000000000000000000000000000000e9494db84d740',
    'logIndex': 113,
    'removed': False,
    'topics': [
      HexBytes('0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'),
      HexBytes('0x000000000000000000000000c3be16c9755aadbadf4065a3e684077217425d69'),
      HexBytes('0x00000000000000000000000049497a4d914ae91d34ce80030fe620687bf333fd')
    ],
    'transactionHash': HexBytes('0xdb6145aa39a7a251a6e1ab70d710e7316f4598a71beca756055167a20a993fa5'),
    'transactionIndex': 65,
    'transactionLogIndex': '0x3',
    'type': 'mined'
  }
]

Are you able to verify that the error is coming from Infura? Neither our application code nor go-ethereum nodes return that error response. It may be something coming back from the other API you are using.

I just tested it again and it seems to be working now, might be an intermittent issue :frowning:

I'm pretty sure it was an issue with Infura, because the only thing I changed was my provider URL:

# test_web3_provider_list = ["http://eth-mainnet.alchemyapi.io/jsonrpc/{ALCHEMY_PRIVATE_KEY}"]
test_web3_provider_list = ["https://mainnet.infura.io/v3/{INFURA_PRIVATE_KEY}"] 

Earlier, my event watcher worked with Alchemy and threw the error when I used Infura, but both seem to work now.

Hi! I'm facing the same error response. Here is how to reproduce it:

curl --data '{"method":"eth_getLogs","params":[{"fromBlock":"0x62da56","toBlock":"0x74e491","address":"0x89d24a6b4ccb1b6faa2625fe562bdd9a23260359"}],"id":1,"jsonrpc":"2.0"}' -H "Content-Type: application/json" -X POST https://mainnet.infura.io/v3/...

Response:

{"jsonrpc":"2.0","id":1,"error":{"code":-32005,"message":"query returned more than 1000 results"}}

We currently limit eth_getLogs responses to 1,000 results; the limitations are outlined in this doc: https://infura.io/docs/ethereum/json-rpc/eth_getLogs

Is there a way to get the block number that causes the error, so we can optimise the next batch? It would be great if Infura could do the batching for us.

Hi @mhoangvslev. The recommended approach is to check for the "code": -32005 error in the response and re-query with a smaller block range (e.g. from a 10k block range down to a 5k block range). There isn't a specific number of blocks that causes the error, as it depends on the specifics of the query.

The Ethereum JSON-RPC standard lacks typical paginated or cursor-based responses that would allow a client to step through a large number of query results. Lacking those standardized features, developers should emulate that behavior on the application side.
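The re-query-with-a-smaller-range approach above can be sketched as a recursive split. Here `fetch_logs(from_block, to_block)` is a stand-in for a call such as `w3.eth.getLogs({...})` over an explicit range; web3.py surfaces JSON-RPC errors as raised ValueErrors, which is what triggers the split. The function name is illustrative:

```python
def get_logs_chunked(fetch_logs, from_block, to_block):
    """Fetch logs over [from_block, to_block], halving the range and
    recursing whenever the backend rejects the query as too large."""
    try:
        return fetch_logs(from_block, to_block)
    except ValueError:
        if from_block >= to_block:
            raise  # a single block exceeds the limit; cannot split further
        mid = (from_block + to_block) // 2
        return (get_logs_chunked(fetch_logs, from_block, mid)
                + get_logs_chunked(fetch_logs, mid + 1, to_block))
```

Because results are concatenated left-half then right-half, the merged list stays in block order without any extra sorting.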


Thank you for your answer. For now, I keep reducing the upper bound by half until the batch is small enough, then continue, but it takes a lot of time. Caching would help here: retrieve the current block height once and update accordingly. I hope there is a more efficient solution.

@egalano there seem to be some single blocks with more than 1000 logs, making it impossible to fetch their logs:

Post the following to Infura's mainnet endpoint:

{
    "jsonrpc": "2.0",
    "method": "eth_getLogs",
    "params": [
        {
            "fromBlock": "0x5EEA68",
            "toBlock": "0x5EEA68",
            "topics": [
                [
                    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
                    "0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
                    "0x17307eab39ab6107e8899845ad3d59bd9653f200f220920489ca2b5937696c31",
                    "0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c",
                    "0x7fcf532c15f0a6db0bd6d0e038bea71d30d808c7d98cb3bf7268a95bf5081b65",
                    "0x0bcc4c97732e47d9946f229edb95f5b6323f601300e4690de719993f3c371129",
                    "0xdc47b3613d9fe400085f6dbdc99453462279057e6207385042827ed6b1a62cf7",
                    "0x82af639571738f4ebd4268fb0363d8957ebe1bbb9e78dba5ebd69eed39b154f0"
                ]
            ]
        }
    ],
    "id": 1
}

It will return:

{
    "jsonrpc": "2.0",
    "id": 1,
    "error": {
        "code": -32005,
        "message": "query returned more than 1000 results"
    }
}

How should we deal with this?

Edit: Some additional blocks with over 1000 logs: 0x5EEAC1, 0x5EEA6D.

Thanks for the info @fabio_0x. Let me talk with our engineering team and see what options we have.


Thanks @fabio_0x, we've pushed a fix to raise the limit to 10,000. I need to watch our infrastructure and see if we can support that max query size in all situations, but I will make sure that, no matter what, you can always get a single block, even if it exceeds the query limit.

Thanks a lot Ryan! I really appreciate the speed at which this issue was addressed! :pray:

Hello,
Is the limit still raised to 10,000? I’m trying to query a single block which has more than 1,000 logs and am only getting back 1,000.
Thanks!
Billy

Yes, the limit is 10,000. @okwme, can you provide an example block number or hash where this is occurring?

Thanks for the reply @Ryan_Schneider!

I've gotten to the bottom of it. I was overwriting my toBlock with "latest". I'm using ethers.js, which is supposed to switch between providers when errors occur. Most likely ethers.js received the over-10k error from you and then fell back to Etherscan or another provider that caps results at 1,000 without returning an error. When I try it against your API directly, I see that even that mistakenly large query returns the 10k error.

So all good now and thanks again!

Hi,
can you please share some insight into how you would approach fetching, say, 100k event logs? I have tried slicing the block range into up to 1,000 chunks, looping over them, and merging the results, but I still get the 10,000 limit error in some ranges.
Any help would be appreciated.
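One pattern that avoids re-halving from scratch on every batch is an adaptive window: walk the range in fixed-size steps, shrink the window when the provider rejects a query, and grow it back after successes. A sketch under the same assumptions as before — `fetch_logs(from_block, to_block)` stands in for `w3.eth.getLogs` over an explicit range, and web3.py surfaces the JSON-RPC limit error as a ValueError; the function name is illustrative:

```python
def iter_log_batches(fetch_logs, start_block, end_block, max_span=10_000):
    """Yield batches of logs covering [start_block, end_block] in order,
    adapting the query window to the provider's result limit."""
    span = max_span
    frm = start_block
    while frm <= end_block:
        to = min(frm + span - 1, end_block)
        try:
            batch = fetch_logs(frm, to)
        except ValueError:
            if span == 1:
                raise  # a single block still exceeds the provider limit
            span = max(1, span // 2)  # shrink the window and retry
            continue
        yield batch
        frm = to + 1
        span = min(max_span, span * 2)  # grow back toward the full window
```

Because dense and sparse regions of the chain alternate, letting the window recover after each success tends to need far fewer requests than always querying with the smallest range that ever worked.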