Tuesday, October 31, 2017

Copying files between Windows and Linux from the command prompt using FTP

Open the command prompt as an administrator on the Windows machine.
Try to connect to the Linux machine using ftp and copy a file.

Here we get the message "Not connected".

To resolve this, we need to modify "vsftpd.conf" on the Linux machine as below:

Uncomment the following line and save the file:

Now restart the FTP server as below:
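As a sketch (the configuration path, the exact directive, and the service name all vary by distribution; `write_enable=YES` is the directive commonly uncommented so uploads are allowed):

```shell
# Open the vsftpd configuration (on some distros it is /etc/vsftpd.conf)
sudo vi /etc/vsftpd/vsftpd.conf

# Directive commonly uncommented so FTP uploads work:
# write_enable=YES

# Restart the FTP service (on systemd systems: sudo systemctl restart vsftpd)
sudo service vsftpd restart
```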

Now again try to connect from the Windows machine

Here we tried to connect as the root user, but the login failed.
By default the FTP server does not allow root logins, so try with a normal user.

Copy a file from Windows to Linux using the "put" command at the ftp prompt

Now check for the file on the Linux machine
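For illustration, such a session might look like this (the host IP, user name, and file name are all placeholders, not the values from the original screenshots):

```shell
C:\> ftp 192.168.1.10      # placeholder Linux host IP
User: demo                 # log in as a normal (non-root) user
ftp> put sample.txt        # upload sample.txt to the remote home directory
ftp> bye                   # end the session
```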

Basic ftp commands:

1. ls
List the contents of the remote directory

2. cd
Change the remote working directory

3. dir
List the contents of the remote directory in long format

4. get
Receive a file from the Linux machine on the Windows machine

Check for the file on the Windows machine

5. lcd
Change the local working directory on the Windows machine

6. mget
Get multiple files from the Linux machine

Check the files on the Windows machine

7. mput
Send multiple files to the Linux machine

Check the files on the Linux machine

8. rename
Rename a remote file

Check for the renamed file

9. bye
Terminate the ftp session

10. close
Terminate the ftp connection

11. help
Display the local help information
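The commands above can be sketched as a single session (the file names are placeholders):

```shell
ftp> lcd C:\Downloads        # change the local working directory on Windows
ftp> get report.txt          # receive one file from the Linux machine
ftp> mget *.log              # receive multiple files
ftp> mput *.csv              # send multiple files to the Linux machine
ftp> rename old.txt new.txt  # rename a remote file
ftp> bye                     # terminate the session
```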

Basic Hadoop commands

1. Creating a directory in HDFS

$ hdfs dfs -mkdir <paths>

2. Listing directories in HDFS
$ hdfs dfs -ls <args>

3. Permissions
Change the permissions or ownership of files in HDFS
$ hdfs dfs -chmod [-R] <MODE> URI
$ hdfs dfs -chown [-R] [OWNER][:[GROUP]] URI

4. Putting a file into the HDFS file system
$ hdfs dfs -put <local-src> ... <HDFS_dest_path>

5. Space utilization of an HDFS directory
$ hdfs dfs -du URI

6. Downloading files from HDFS to the local file system
$ hdfs dfs -get <hdfs_src> <localdst>

7. Merging files
We can merge the files under an HDFS path into a single file on the local file system as below
$ hdfs dfs -getmerge <src> <localdst> [addnl]
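For example, assuming a hypothetical HDFS directory /data/logs containing several part files:

```shell
# concatenate every file under /data/logs into one local file
hdfs dfs -getmerge /data/logs /tmp/merged.log
```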

8. Copying files or directories recursively
$ hdfs dfs -cp <src-url> <dest-url>

9. Help command
Use the help command to access the Hadoop command manual
$ hdfs dfs -help

10. Viewing the contents of a file
$ hdfs dfs -cat <path[filename]>

11. Copying a file from source to destination within the HDFS file system
$ hdfs dfs -cp <source> <dest>

12. Copying a file using copyFromLocal and copyToLocal
$ hdfs dfs -copyFromLocal <localsrc> URI

$ hdfs dfs -copyToLocal [-ignorecrc] [-crc] URI <localdst>

13. Moving a file from source to destination in the HDFS file system
$ hdfs dfs -mv <src> <dest>

14. Removing a file or directory from the HDFS file system
$ hdfs dfs -rm <arg>

15. Removing files recursively
$ hdfs dfs -rm -r <arg>

16. Displaying the last few lines of a file
$ hdfs dfs -tail <path[filename]>

17. Hadoop version
$ hadoop version

18. Checking the amount of space used and available in the HDFS file system
$ hdfs dfs -df [-h] <path>

19. Counting the number of files and directories in HDFS
$ hdfs dfs -count <paths>
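Taken together, the commands above can be sketched as a short workflow (all paths and file names here are hypothetical):

```shell
hdfs dfs -mkdir /user/demo/input                # create a directory
hdfs dfs -put localfile.txt /user/demo/input    # upload a local file
hdfs dfs -ls /user/demo/input                   # list the directory
hdfs dfs -cat /user/demo/input/localfile.txt    # view the file contents
hdfs dfs -count /user/demo                      # count files and directories
hdfs dfs -rm -r /user/demo/input                # remove the directory recursively
```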

Monday, October 23, 2017

Linking Aadhar card number with PF account online

1. Go to the website

2. Under Online Services click on eKYC Portal

3. In the next window click on LINK UAN AADHAR

4. In the next window enter your UAN and click on Generate OTP

5. Confirm the OTP sent to your registered mobile number, enter your Aadhar number,
and click on Submit

6. In the next window click on Proceed for OTP Verification

7. Select the declaration and click on Generate OTP

8. Enter the OTP sent to your registered mobile number and click on Validate OTP

9. If the details in Aadhar and UAN match, we will get a success message; otherwise it will suggest that we update our details like below

10. Click on Exit, update the details in UAN to match Aadhar, and try again to link Aadhar with UAN

Perfect order of ODI export and import

  Smart Export / Import is the recommended way to achieve such a migration. This is the easiest way to make sure all the dependencies are ta...