Spark process doesn't finish when publishing to RabbitMQ

At the end of my Spark app (containerized Spark v2.4.7), I publish a message to RabbitMQ. The app runs successfully and the message is even published to my containerized RabbitMQ. The problem is that the process never actually finishes... I have to Ctrl+C in the terminal to abort it.

Log lines that I write to the console after publishing the message do appear, which means the process isn't stuck inside the RabbitMQ client.

I tried closing the RabbitMQ channel and connection, but it didn't help. I also tried closing the SparkSession at the end, but that didn't help either.

I wrote a JUnit test that creates the same queue with the same configuration and publishes the same message with the same client configuration, and it finishes successfully without getting stuck.

My RabbitMQ publishing implementation:

private Connection getConnection() throws KeyManagementException, NoSuchAlgorithmException, URISyntaxException, IOException, TimeoutException {
    ConnectionFactory factory = new ConnectionFactory();
    String uri = getURI();
    factory.setUri(new URI(uri));
    factory.setConnectionTimeout(10000);
    return factory.newConnection();
}

@Override
public void publish(String msg) {
    try {
        Connection connection = getConnection();
        Channel channel = connection.createChannel();
        channel.queueDeclare("rabbitQueue", true, false, false, null);
        channel.basicPublish("exchange", "rabbitKey", null, msg.getBytes());
    } catch (Exception ex) {
        logger.error("Error, failed to create connection to RabbitMQ", ex);
    }
}
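While debugging, I started to suspect the hang is about JVM lifetime rather than Spark or RabbitMQ specifically: a single live non-daemon thread (the kind a client library can spawn and leave behind if a connection isn't closed) keeps the JVM from exiting after main() returns. A minimal, broker-free sketch of just that JVM behavior (class and method names are mine, purely for illustration):

```java
public class NonDaemonDemo {

    // Start a worker thread that sleeps "forever", similar to an
    // idle I/O or heartbeat thread left behind by an unclosed client.
    public static Thread startWorker() {
        Thread t = new Thread(() -> {
            try {
                Thread.sleep(Long.MAX_VALUE);
            } catch (InterruptedException e) {
                // interrupted: fall through and let the thread die
            }
        }, "leaked-worker");
        // Threads created from main are non-daemon by default; any live
        // non-daemon thread prevents the JVM from exiting.
        t.start();
        return t;
    }

    public static void main(String[] args) {
        Thread t = startWorker();
        System.out.println("worker daemon? " + t.isDaemon()); // prints "worker daemon? false"
        // Without this interrupt, the process would keep running
        // after main() returns, exactly like my hanging Spark app.
        t.interrupt();
    }
}
```

I'm not certain this is the cause in my case, but it would explain why my log lines print and yet the process never exits.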

What could be the reason that my process doesn't finish? Thanks!



Sources

Source: Stack Overflow, licensed under CC BY-SA 3.0, per Stack Overflow's attribution requirements.