
Support China domain in lambda cloudwatch logs url #1051

Merged: 4 commits into getsentry:master on Jun 11, 2021

Conversation

@charlesverdad (Contributor)

The AWSLambdaIntegration adds the cloudwatch URL in the context but the domain is hardcoded and doesn't work for China regions.
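For context, a minimal sketch of the issue (illustrative only, not the SDK's actual code): with the console domain baked into the URL template, the generated link can never resolve for regions in the aws-cn partition, e.g. cn-north-1.

```python
# Illustrative sketch only, not the SDK's actual implementation.
# With the console domain hardcoded, the generated link is wrong
# for China regions such as cn-north-1:
def cloudwatch_logs_url(region, log_group, log_stream):
    return (
        "https://console.aws.amazon.com/cloudwatch/home?region={region}"
        "#logEventViewer:group={log_group};stream={log_stream}"
    ).format(region=region, log_group=log_group, log_stream=log_stream)
```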

@untitaker (Member)

Is that the only region where the domain is different?

@charlesverdad (Contributor, Author)

> Is that the only region where the domain is different?

Ah good question. It looks like AWS has three partitions:

aws - aws.amazon.com
aws-cn - amazonaws.cn
aws-us-gov - amazonaws-us-gov.com

I don't think we should worry about the aws-us-gov partition. WDYT?
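Following that reasoning, here is a sketch of a partition-aware lookup, assuming only the aws and aws-cn partitions need handling and that China regions can be recognized by their `cn-` prefix (the helper names are hypothetical, not the SDK's actual API):

```python
# Hypothetical helpers sketching the approach discussed above, assuming
# China regions are detectable by their "cn-" prefix.
def console_domain(region):
    if region.startswith("cn-"):
        return "amazonaws.cn"  # aws-cn partition
    return "aws.amazon.com"    # default aws partition

def cloudwatch_logs_url(region, log_group, log_stream):
    return (
        "https://console.{domain}/cloudwatch/home?region={region}"
        "#logEventViewer:group={log_group};stream={log_stream}"
    ).format(
        domain=console_domain(region),
        region=region,
        log_group=log_group,
        log_stream=log_stream,
    )
```

If the aws-us-gov partition ever needs support, the same lookup could map the corresponding region prefix to amazonaws-us-gov.com.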

@untitaker (Member)

OK, this makes sense. There are some failing test cases and lint jobs; could you check them locally? CONTRIBUTING.md should contain enough info to get started.

@charlesverdad (Contributor, Author) commented Apr 16, 2021

@untitaker I'm not sure why the tests are failing for Python >= 3.7 on my PR; AFAIK, my changes should not have affected this.


tests/integrations/spark/test_spark.py FF.......                         [100%]

=================================== FAILURES ===================================
___________________________ test_set_app_properties ____________________________
tests/integrations/spark/test_spark.py:25: in test_set_app_properties
    spark_context = SparkContext(appName="Testing123")
.tox/py3.7-spark/lib/python3.7/site-packages/pyspark/context.py:136: in __init__
    conf, jsc, profiler_cls)
.tox/py3.7-spark/lib/python3.7/site-packages/pyspark/context.py:198: in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
.tox/py3.7-spark/lib/python3.7/site-packages/pyspark/context.py:306: in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
.tox/py3.7-spark/lib/python3.7/site-packages/py4j/java_gateway.py:1525: in __call__
    answer, self._gateway_client, None, self._fqn)
.tox/py3.7-spark/lib/python3.7/site-packages/py4j/protocol.py:328: in get_return_value
    format(target_id, ".", name), value)
E   py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
E   : java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
E   	at sun.nio.ch.Net.bind0(Native Method)
E   	at sun.nio.ch.Net.bind(Net.java:461)
E   	at sun.nio.ch.Net.bind(Net.java:453)
E   	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:222)
E   	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
E   	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
E   	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
E   	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
E   	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
E   	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
E   	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
E   	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
E   	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
E   	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
E   	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
E   	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
E   	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
E   	at java.lang.Thread.run(Thread.java:748)
----------------------------- Captured stderr call -----------------------------
21/04/15 06:57:51 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/04/15 06:57:52 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.

and

tests/integrations/django/asgi/test_asgi.py .......

=================================== FAILURES ===================================
_______________________ test_transaction_with_class_view _______________________
tests/integrations/django/test_basic.py:121: in test_transaction_with_class_view
    assert (
E   AssertionError: assert 'tests.integr...<locals>.view' == 'tests.integr...lassBasedView'
E     - tests.integrations.django.myapp.views.ClassBasedView
E     + tests.integrations.django.myapp.views.View.as_view.<locals>.view
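For what it's worth, the Spark failure above looks environmental rather than related to this PR: the driver cannot bind the 'sparkDriver' service to a port in the CI sandbox. A minimal sketch of the usual workaround (my assumption, not something applied in this thread) is to pin the driver bind address explicitly, or equivalently set the SPARK_LOCAL_IP environment variable:

```python
# Assumption: the bind failure is the common CI/hostname issue; pinning
# the driver bind address to localhost typically works around it.
from pyspark import SparkConf, SparkContext

conf = (
    SparkConf()
    .setMaster("local[*]")
    .set("spark.driver.bindAddress", "127.0.0.1")
)
spark_context = SparkContext(appName="Testing123", conf=conf)
```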

@ahmedetefy merged commit e204e1a into getsentry:master on Jun 11, 2021
@ahmedetefy (Contributor)

Thanks for the PR!
