Infura Availability

Hi,
I work on a data ingestion project, and we’re getting a lot of 504 Gateway Timeout errors. The dashboard has also been down for a while, yet the Infura status page says everything is operational.

Is there an ongoing incident?

Status page: https://status.infura.io/

Georgi,

Thanks for reaching out! Did your 504 errors start today? There was only a small eth2 beacon chain outage on June 24th, but all services should be operational now. Could you share any other logs, screenshots, code snippets, or errors you have come across, along with the exact steps you are executing, so we can reproduce this on our end?

Kind regards,
Alex | Infura Support

I can’t attach photos here, but it’s still going on, and so is the dashboard outage.

Georgi,

Services are operational on our end. Can we please check your DNS settings? Which operating system are you running? I suspect this could be a DNS issue.

Kind regards,
Alex | Infura Support
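
(Note for anyone following along: a quick way to rule DNS in or out, assuming Python is available, is to resolve the beacon hostname from this thread directly; if the lookup fails, name resolution is the problem, otherwise the 504s are coming from further upstream.)

  import socket

  # Minimal DNS sanity check (sketch): the hostname is the beacon endpoint used
  # in this thread, port 443 because the API is served over HTTPS.
  try:
      addresses = socket.getaddrinfo("eth2-beacon-mainnet.infura.io", 443)
      print(f"resolved to {len(addresses)} address(es)")   # DNS is fine
  except socket.gaierror as exc:
      print(f"DNS resolution failed: {exc}")               # DNS is the problem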

Alex,

I still cannot access the dashboard, and I am still getting 504s when requesting data from the node. Since I can’t upload images here, I’m sending a link with what I see on my side: Images

The images show that DNS resolution succeeds both on my machine and on my server, since I’m getting timeout errors rather than “host name not resolved” errors. You can also see that I’m getting a lot of timeouts (about 2.6k per 24 hours), and I only started measuring 3 days ago because of this issue.

Let me know if I can provide more info.

Timing-out URLs (assuming this is related to specific blocks/slots; a reproduction sketch follows the list):

  • /eth/v1/beacon/states/4139232/validators
  • /eth/v1/beacon/states/4140288/validators
  • /eth/v1/beacon/states/4140256/validators
  • /eth/v1/beacon/states/4136384/validators
  • /eth/v2/beacon/blocks/4138564
  • /eth/v2/beacon/blocks/4140234
  • /eth/v1/beacon/states/4139712/validators
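
A minimal reproduction sketch for the URLs above could look like the following (the project ID, secret, and timeout value are placeholders/assumptions). A 502/504 status here is an actual HTTP response from the gateway, so DNS and the connection worked; a ConnectionError would instead point at DNS or network problems.

  import requests

  AUTH = ("PROJECT_ID", "PROJECT_SECRET")   # placeholders, not real credentials
  BASE = "https://eth2-beacon-mainnet.infura.io"

  PATHS = [
      "/eth/v1/beacon/states/4139232/validators",
      "/eth/v2/beacon/blocks/4138564",
  ]

  for path in PATHS:
      try:
          resp = requests.get(BASE + path, auth=AUTH, timeout=60)
          # 502/504 here means the gateway answered but the backend did not
          # respond in time.
          print(path, resp.status_code)
      except requests.exceptions.ConnectionError as exc:
          # A DNS failure would surface here instead.
          print(path, "connection error:", exc)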

Georgi,

The dev team is looking into the nodes now. We are implementing a fix, but there is no ETA yet. I will update you with more information as I get it.

Kind regards,
Alex | Infura Support

Hi,

Any update? I am still getting a ton of timeouts.

  File "/opt/python3.8/lib/python3.8/site-packages/requests/models.py", line 960, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 502 Server Error: Bad Gateway for url: https://***:***@eth2-beacon-mainnet.infura.io/eth/v1/beacon/states/4188928/validators

During handling of the above exception, another exception occurred:

Regards

@the-veloper The nodes have been resynced. I ran some tests this morning and all my requests are getting through. Any chance you can restart your script/service and give it another try? Could we also test with a general block header or validator request to see if you still encounter the 502 error?

Kind regards,
Alex | Infura Support
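
A sanity check along the lines Alex suggests might look like this (a sketch assuming the standard Beacon API block header endpoint; credentials are placeholders):

  import requests

  AUTH = ("PROJECT_ID", "PROJECT_SECRET")   # placeholders
  BASE = "https://eth2-beacon-mainnet.infura.io"

  # Fetch the latest block header: a 200 means the node is reachable and can
  # serve light queries, so any remaining 502s would be specific to the much
  # heavier validator-state requests.
  resp = requests.get(f"{BASE}/eth/v1/beacon/headers/head", auth=AUTH, timeout=30)
  print(resp.status_code)
  if resp.ok:
      print(resp.json())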

Hi,

Looks like the issue is still going on. (Timezone is UTC.)

  File "/home/airflow/gcs/dags/ethereum2etl/api/request.py", line 28, in make_get_request
    response.raise_for_status()
  File "/opt/python3.6/lib/python3.6/site-packages/requests/models.py", line 943, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 502 Server Error: Bad Gateway for url: https://***:***@eth2-beacon-mainnet.infura.io/eth/v1/beacon/states/4143328/validators
[2022-07-07 06:19:21,946] {taskinstance.py:1196} INFO - Marking task as FAILED. dag_id=eth2_mainnet_hourly_export_dag, task_id=export_beacon_validators, execution_date=20220629T233000, start_date=20220707T060914, end_date=20220707T061921
[2022-07-07 06:19:22,355] {sendgrid.py:116} INFO - Email with subject Airflow alert: <TaskInstance: eth2_mainnet_hourly_export_dag.export_beacon_validators 2022-06-29T23:30:00+00:00 [failed]> is successfully sent to recipients: [{'to': [{'email': '<CENSORED>'}]}]
[2022-07-07 06:19:24,068] {local_task_job.py:102} INFO - Task exited with return code 1

Let me know if I can provide more info.
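
Since a single transient 502 fails the whole export task here, one client-side mitigation (a sketch only; it does not fix the server-side issue, and the wrapper below is hypothetical rather than part of ethereum2etl) is to retry gateway errors with backoff:

  import time
  import requests

  def get_with_retries(url, auth=None, retries=5, backoff=2.0, timeout=60):
      """Hypothetical helper: retry transient 5xx gateway errors with
      exponential backoff instead of failing the task outright."""
      for attempt in range(retries):
          try:
              resp = requests.get(url, auth=auth, timeout=timeout)
              if resp.status_code in (502, 503, 504) and attempt < retries - 1:
                  time.sleep(backoff * 2 ** attempt)
                  continue
              resp.raise_for_status()
              return resp
          except requests.exceptions.ConnectionError:
              if attempt == retries - 1:
                  raise
              time.sleep(backoff * 2 ** attempt)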

To everyone facing the same issue:

This is not related to any node provider; this is just poor design on the eth2 side: the /eth/v1/beacon/states/{state_id}/validators calls above return the entire validator set in a single response, which is a very heavy query for any beacon node.
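
If the root cause really is the size of a full validator-set response, one way to lighten these calls is to request validators in smaller batches via the "id" query parameter from the standard Beacon API spec (a sketch; the slot, batch size, and index range are arbitrary, and whether the provider honours the filter is an assumption):

  import requests

  AUTH = ("PROJECT_ID", "PROJECT_SECRET")   # placeholders
  BASE = "https://eth2-beacon-mainnet.infura.io"
  SLOT = 4139232       # one of the slots from the list above
  BATCH = 100          # arbitrary batch size (keeps URLs reasonably short)

  for start in range(0, 1000, BATCH):   # first 1000 validator indices, for illustration
      ids = list(range(start, start + BATCH))
      # requests encodes a list as repeated ?id=... query parameters, which is
      # how the Beacon API spec expresses array parameters.
      resp = requests.get(
          f"{BASE}/eth/v1/beacon/states/{SLOT}/validators",
          params={"id": ids},
          auth=AUTH,
          timeout=60,
      )
      print(start, resp.status_code)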

CPU going wild: