Thursday, July 27, 2017
MySQL Server on Linux
1. Checking the status of the MySQL server
Open the terminal and run the below command to check the status of the MySQL server:
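On a systemd-based distribution the check might look like the following sketch (the service name varies by distribution, commonly `mysql` or `mysqld`):

```shell
# Check the MySQL service status (service name may be mysql or mysqld)
sudo systemctl status mysql

# On older SysV-init systems the equivalent is:
sudo service mysql status
```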
2. Starting the MySQL server
Run the below command to start the MySQL server:
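Assuming the same service name as above, a start-and-verify sequence might be:

```shell
# Start the MySQL service, then confirm it is running
sudo systemctl start mysql
sudo systemctl status mysql
```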
3. Setting a password for the MySQL root user
To set the password for the MySQL root user, run the below command:
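One common way to do this on a fresh 5.x install is with `mysqladmin` (the password shown is a placeholder, not from the original post):

```shell
# Set the password for the MySQL root user (not the Linux root user).
# 'NewPassword' is a placeholder - replace it with your own password.
mysqladmin -u root password 'NewPassword'
```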
4. Connecting to the MySQL server as the root user
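The standard client invocation is:

```shell
# Connect as root; the client will prompt for the password set above
mysql -u root -p
```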
5. Granting privileges to connect to the MySQL server remotely
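A typical grant for MySQL 5.x might look like the following sketch; `'%'` allows connections from any remote host, and `'NewPassword'` is a placeholder:

```shell
# Run the GRANT from the shell via the mysql client.
# 'root'@'%' = the root account connecting from any host.
mysql -u root -p -e "GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY 'NewPassword'; FLUSH PRIVILEGES;"
```

For tighter security, replace `'%'` with the specific client host or IP that needs access.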
6. Binding the IP address and port number for the MySQL server
Under /etc, modify the my.cnf file as below:
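The relevant `[mysqld]` section might look like this (the values are examples; `0.0.0.0` listens on all interfaces, or use a specific IP):

```
[mysqld]
bind-address = 0.0.0.0
port = 3306
```

Restart the MySQL service after editing the file so the change takes effect.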
Steps for clearing the history in the Linux terminal
1. Command to check the history
Open a terminal, type the command "history", and press Enter.
It will show the history of all the commands we have used.
2. Command to clear the history
To clear this history, type the command "history -c" in the terminal and press Enter.
3. Now check the history again; it will show nothing.
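The whole sequence in an interactive bash session:

```shell
history      # list previously executed commands
history -c   # clear the in-memory history of the current session
history      # now prints nothing
```

Note that `history -c` only clears the current session's in-memory list; entries already saved to `~/.bash_history` can be removed by truncating that file as well.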
Friday, July 14, 2017
Starting and stopping Apache Hadoop and YARN
Starting Hadoop
Login to the server through PuTTY
Change to the hadoop user
Navigate to the HADOOP_HOME/sbin directory
Start Hadoop using the start-dfs.sh command
Now check the started nodes using the jps command as below

Check the node information at this URL: http://hostname:50070
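The steps above can be sketched as a shell session (the hadoop user name and HADOOP_HOME path depend on your installation):

```shell
su - hadoop              # change to the hadoop user
cd $HADOOP_HOME/sbin     # e.g. /opt/hadoop/sbin
./start-dfs.sh           # starts the HDFS daemons
jps                      # should list NameNode, DataNode, SecondaryNameNode
```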
Starting YARN
Login to the server through PuTTY
Change to the hadoop user
Navigate to the HADOOP_HOME/sbin directory
Start YARN using the start-yarn.sh command
Now check the started nodes using the jps command as below
Check the node information at this URL: http://hostname:8088
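Sketched as a shell session, under the same assumptions about the hadoop user and HADOOP_HOME:

```shell
su - hadoop
cd $HADOOP_HOME/sbin
./start-yarn.sh          # starts the YARN daemons
jps                      # should now also list ResourceManager and NodeManager
```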
Stopping YARN
Login to the server through PuTTY
Change to the hadoop user
Navigate to the HADOOP_HOME/sbin directory
Stop YARN using the stop-yarn.sh command

Stopping Hadoop
Login to the server through PuTTY
Change to the hadoop user
Navigate to the HADOOP_HOME/sbin directory
Stop Hadoop using the stop-dfs.sh command
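Both shutdown sequences together, as a sketch (same assumptions as the start-up examples; YARN is stopped before HDFS):

```shell
su - hadoop
cd $HADOOP_HOME/sbin
./stop-yarn.sh           # stop the YARN daemons first
./stop-dfs.sh            # then stop the HDFS daemons
jps                      # only the Jps process itself should remain
```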
Cloudera Apache Hadoop Web UI URLs
Name Node URL
http://hostname:50070
All Applications (YARN)
http://hostname:8088/cluster
Hive web URL
http://hostname:10002/hiveserver2.jsp
Hue web URL
http://hostname:8888
Oozie web console
http://hostname:11000/oozie
Spark history server
http://hostname:18088
YARN job history server
http://hostname:19888/jobhistory
Sunday, July 2, 2017
OBIEE and OBIA Version History
OBIEE is based on Siebel Business Analytics, which was first developed by nQuire in the year 1997.
Please find the major version releases below:
2001: nQuire acquired by Siebel
Siebel Analytics 7.0 – 2002
Siebel Analytics 7.5 – 2003
Siebel Analytics 7.7 – 2004
Siebel Analytics 7.8.2 and 7.8.3 – 2005
Siebel Analytics / Oracle Business Intelligence 7.8.4 and 7.8.5 – 2006
Please find the OBIEE version releases below:
Please find the OBIA version releases below: