Sunday, March 25, 2018
java.lang.OutOfMemoryError: Java heap space, Running OOM killer script for process 11420 for Solr on port 8983 (Hortonworks)
In a Hortonworks cluster environment, the Ambari Infra Solr service failed to start with the error below:
# java.lang.OutOfMemoryError: Java heap space
# -XX:OnOutOfMemoryError="/usr/lib/ambari-infra-solr/bin/oom_solr.sh 8983 /var/log/ambari-infra-solr"
# Executing /bin/sh -c "/usr/lib/ambari-infra-solr/bin/oom_solr.sh 8983 /var/log/ambari-infra-solr"...
Running OOM killer script for process 11420 for Solr on port 8983
Killed process 11420
Resolution:
To resolve this:
1. Log in to the Ambari management console.
2. Go to Ambari Infra.
3. Click on Configs.
4. Under the Settings tab, increase the Infra Solr minimum and maximum heap sizes and click Save.
5. Start the service.
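As a rough illustration of what the heap settings control, the minimum and maximum values end up as the JVM's -Xms/-Xmx flags when Solr starts. The variable names below are hypothetical stand-ins, not the actual Ambari property names:

```shell
# Illustration only: how min/max heap settings map to the JVM flags Solr
# is launched with. Variable names here are hypothetical examples.
INFRA_SOLR_MINMEM="1024"   # minimum heap in MB (example value)
INFRA_SOLR_MAXMEM="2048"   # maximum heap in MB (example value)
SOLR_JAVA_MEM="-Xms${INFRA_SOLR_MINMEM}m -Xmx${INFRA_SOLR_MAXMEM}m"
echo "$SOLR_JAVA_MEM"
```

Raising the maximum gives Solr more room before the OOM killer script fires, at the cost of memory available to other services on the node.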
Sunday, March 11, 2018
Error: Could not contact Elasticsearch at http://localhost:9200. Please ensure that Elasticsearch is reachable from your system
Error: Could not contact Elasticsearch at http://172.16.10.53:9200. Please ensure that Elasticsearch is reachable from your system
We get the above error when opening the Kibana console.
Resolution:
Navigate to $ELASTICSEARCH_HOME/config.
Open elasticsearch.yml and add the lines below:
script.disable_dynamic: true
http.cors.enabled: true
http.cors.allow-origin: "/.*/"
Save and close the file.
Restart Elasticsearch and reopen the Kibana console.
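The edit above can be sketched as a quick shell snippet. The ELASTICSEARCH_HOME default used here is an assumed demo path, so point it at your actual installation before running:

```shell
# Append the CORS settings to elasticsearch.yml.
# The ELASTICSEARCH_HOME default is an assumed demo path; set it to your install.
ELASTICSEARCH_HOME="${ELASTICSEARCH_HOME:-/tmp/es-demo}"
mkdir -p "$ELASTICSEARCH_HOME/config"
cat >> "$ELASTICSEARCH_HOME/config/elasticsearch.yml" <<'EOF'
script.disable_dynamic: true
http.cors.enabled: true
http.cors.allow-origin: "/.*/"
EOF
```

After appending, restart Elasticsearch so it picks up the new settings.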
Saturday, February 17, 2018
Step-by-step Oracle Stream Analytics installation and configuration
Installing Oracle Stream Analytics
1. Click Next.
2. Select Skip Auto Updates and click Next.
3. Provide the installation location and click Next.
4. Click Next.
5. Click Next.
6. Click Install.
7. Click Next.
8. Click Finish to complete the installation.
Configuring the domain:
Before configuring the domain, create a schema as below.
Run the command below to start the domain configuration.
1. Click Next.
2. Select Create New OSA Domain With Examples and click Next.
3. Provide the user name and password and click Next.
4. Provide the server name and port and click Next.
5. Provide the keystore password and click Next.
6. Click Next.
7. Click Next.
8. Provide the domain name and location and click Next.
9. Click Done to finish the configuration.
Installing the OSA-Spark Integration Component
The OSA-Spark integration component adds Oracle's Continuous Query Language (CQL) support, along with an OSA-specific runtime environment, to the Spark framework to implement application deployment. This component is delivered as part of the OSA server installation as a single JAR at OSA_HOME/oep/spark/lib/spark-osa.jar. This JAR file must be copied to all worker nodes and also to the OSA node. Ideally, you copy this file into your Spark installation at SPARK_HOME/lib/spark-osa.jar.
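The copy step could be scripted roughly as below. The worker host names and the OSA_HOME/SPARK_HOME defaults are placeholders, and the commands are only echoed as a dry run so you can review them before executing:

```shell
# Dry run: print the commands that would distribute spark-osa.jar to each node.
# Host names and the OSA_HOME/SPARK_HOME defaults are placeholder assumptions.
OSA_HOME="${OSA_HOME:-/opt/osa}"
SPARK_HOME="${SPARK_HOME:-/opt/spark}"
for host in spark-worker1 spark-worker2; do
  echo scp "$OSA_HOME/oep/spark/lib/spark-osa.jar" "$host:$SPARK_HOME/lib/spark-osa.jar"
done
```

Remove the `echo` (and substitute your real host names) to perform the actual copy.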
Setting Kafka server.properties
Configuring the OSA Domain for Spark
- Stop the OSA server (OSA domain).
- Create the Spark configuration folder in your domain as: OSA_DOMAIN/config/spark.
- Create the OSA configuration file in the Spark configuration folder as: OSA_DOMAIN/config/spark/osa.properties.
- Edit the OSA configuration file according to your environment.
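The folder and file creation steps above can be sketched as follows. The OSA_DOMAIN default is an assumed demo path; point it at your actual domain directory:

```shell
# Create the Spark configuration folder and the OSA configuration file,
# as described in the steps above. OSA_DOMAIN default is an assumed demo path.
OSA_DOMAIN="${OSA_DOMAIN:-/tmp/osa-domain}"
mkdir -p "$OSA_DOMAIN/config/spark"
touch "$OSA_DOMAIN/config/spark/osa.properties"
```

With the file in place, populate osa.properties for your environment and restart the OSA server.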