[esw@bigdatamgr1 ~]$ cat .bash_profile
...
for i in ~/conf/*.sh ; do
    if [ -r "$i" ] ; then
        . "$i"
    fi
done
[esw@bigdatamgr1 ~]$ ll conf/
total 4
-rwxr-xr-x 1 esw biadmin 292 Mar 24 20:48 reset-biginsights-env.sh
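The only script sourced here is reset-biginsights-env.sh. Its contents are not reproduced in this post; a minimal sketch of what such an environment script might contain, using the install paths that appear in the listings below (the variable names are only assumptions):
# reset-biginsights-env.sh -- hypothetical contents, variable names are assumptions
export BIGINSIGHTS_HOME=/data/opt/ibm/biginsights
export HADOOP_HOME=$BIGINSIGHTS_HOME/IHC
export PATH=$HADOOP_HOME/bin:$BIGINSIGHTS_HOME/bin:$PATH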
As the biadmin user, stop the old JobTracker and TaskTrackers.
[biadmin@bigdatamgr1 IHC]$ ssh `hdfs getconf -confKey mapreduce.jobtracker.address | sed 's/:.*//' ` "sudo -u mapred /data/opt/ibm/biginsights/IHC/sbin/hadoop-daemon.sh stop jobtracker"
[biadmin@bigdatamgr1 biginsights]$ for h in `cat hadoop-conf/slaves ` ; do ssh $h "sudo -u mapred /data/opt/ibm/biginsights/IHC/sbin/hadoop-daemon.sh stop tasktracker" ; done
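To double-check that no TaskTracker survived, a quick loop over the same slaves file can be run (not part of the original transcript, shown here as a sketch):
for h in `cat hadoop-conf/slaves` ; do ssh $h "jps | grep -i tasktracker" ; done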
[biadmin@bigdatamgr1 biginsights]$ bin/start.sh -h
Usage: start.sh <component>...
Start one or more BigInsights components. Start all components if 'all' is
specified. If a component is already started, this command does nothing to it.
For example:
start.sh all
- Starts all components.
start.sh hadoop zookeeper
- Starts hadoop and zookeeper daemons.
OPTIONS:
-ex=<component>
Exclude a component, often used together with 'all'. I.e.
`stop.sh all -ex=console` stops all components but the mgmt console.
-h, --help
Get help information.
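With the old daemons stopped, the Hadoop services can be brought back through the BigInsights wrapper itself, following the usage shown above:
bin/start.sh hadoop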
Packages that are depended on from several places are managed through symlinks, as the listing below shows.
[biadmin@bigdatamgr1 lib]$ ll
total 50336
-rw-r--r-- 1 biadmin biadmin 303042 Jan 30 15:22 avro-1.7.4.jar
lrwxrwxrwx 1 biadmin biadmin 60 Jan 30 15:22 biginsights-gpfs-2.2.0.jar -> /data/opt/ibm/biginsights/IHC/lib/biginsights-gpfs-2.2.0.jar
-rw-r--r-- 1 biadmin biadmin 15322 Jan 30 15:22 findbugs-annotations-1.3.9-1.jar
lrwxrwxrwx 1 biadmin biadmin 48 Jan 30 15:22 guardium-proxy.jar -> /data/opt/ibm/biginsights/lib/guardium-proxy.jar
-rw-r--r-- 1 biadmin biadmin 1795932 Jan 30 15:22 guava-12.0.1.jar
-rw-r--r-- 1 biadmin biadmin 710492 Jan 30 15:22 guice-3.0.jar
-rw-r--r-- 1 biadmin biadmin 65012 Jan 30 15:22 guice-servlet-3.0.jar
lrwxrwxrwx 1 biadmin biadmin 45 Jan 30 15:22 hadoop-core.jar -> /data/opt/ibm/biginsights/IHC/hadoop-core.jar
lrwxrwxrwx 1 biadmin biadmin 76 Jan 30 15:22 hadoop-distcp-2.2.0.jar -> /data/opt/ibm/biginsights/IHC/share/hadoop/tools/lib/hadoop-distcp-2.2.0.jar
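If one of these links ever disappears, it can be recreated by hand from inside the lib directory; for example, using the target shown in the listing above:
ln -s /data/opt/ibm/biginsights/IHC/hadoop-core.jar hadoop-core.jar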
A custom InputFormat, UniformSizeInputFormat, does the splitting and builds the FileSplits: the total size of the files recorded in the CONF_LABEL_LISTING_FILE_PATH listing file is divided evenly into as many chunks as there are map tasks, and one FileSplit per chunk is constructed from the positions of the listing's key-value entries. When a map task runs, the RecordReader reads the key-value pairs covered by its FileSplit and hands them to the map function.
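These classes belong to DistCp, so in practice the number of splits, and therefore map tasks, is controlled by the -m option; a sketch with placeholder cluster addresses and paths:
hadoop distcp -m 20 hdfs://nn1:8020/src hdfs://nn2:8020/dst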
CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:108 (message): Could NOT find ZLIB (missing: ZLIB_INCLUDE_DIR). The cause is that zlib-devel is not installed.
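Installing the zlib headers fixes it; on a yum-based system (assumed here, since GCC 4.4.7 below points to RHEL/CentOS 6):
yum install -y zlib-devel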
main:
[echo] Running test_libhdfs_threaded
[exec] nmdCreate: NativeMiniDfsCluster#Builder#Builder error:
[exec] java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/MiniDFSCluster$Builder
[exec] Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.MiniDFSCluster$Builder
[exec] at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
[exec] at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
[exec] at java.security.AccessController.doPrivileged(Native Method)
[exec] at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
[exec] at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
[exec] at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
[exec] at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
[exec] TEST_ERROR: failed on /root/hadoop-2.6.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/libhdfs/test_libhdfs_threaded.c:326 (errno: 2): got NULL from tlhCluster
Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the system variable OPENSSL_ROOT_DIR (full log below). The cause is that openssl-devel is not installed.
main:
[mkdir] Created dir: /root/hadoop-2.6.0-src/hadoop-tools/hadoop-pipes/target/native
[exec] -- The C compiler identification is GNU 4.4.7
[exec] -- The CXX compiler identification is GNU 4.4.7
[exec] -- Check for working C compiler: /usr/bin/cc
[exec] -- Check for working C compiler: /usr/bin/cc -- works
[exec] -- Detecting C compiler ABI info
[exec] -- Detecting C compiler ABI info - done
[exec] -- Check for working CXX compiler: /usr/bin/c++
[exec] -- Check for working CXX compiler: /usr/bin/c++ -- works
[exec] -- Detecting CXX compiler ABI info
[exec] -- Detecting CXX compiler ABI info - done
[exec] -- Configuring incomplete, errors occurred!
[exec] See also "/root/hadoop-2.6.0-src/hadoop-tools/hadoop-pipes/target/native/CMakeFiles/CMakeOutput.log".
[exec] CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
[exec] Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the
[exec] system variable OPENSSL_ROOT_DIR (missing: OPENSSL_LIBRARIES
[exec] OPENSSL_INCLUDE_DIR)
[exec] Call Stack (most recent call first):
[exec] /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
[exec] /usr/share/cmake/Modules/FindOpenSSL.cmake:313 (find_package_handle_standard_args)
[exec] CMakeLists.txt:20 (find_package)
[exec]
[exec]
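As with zlib, the missing piece is the OpenSSL development package; on a yum-based system (again an assumption about the OS) install it and then re-run the native build, e.g. with the standard Hadoop native profile:
yum install -y openssl-devel
mvn package -Pdist,native -DskipTests -Dtar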
# Mount the VMware Tools CD image attached to the guest
cd /mnt
mkdir cdrom
mount /dev/cdrom cdrom
cd cdrom/
# Unpack VMware Tools into the home directory and run the installer
mkdir ~/vmware
tar zxvf VMwareTools-9.2.0-799703.tar.gz -C ~/vmware
cd ~/vmware
cd vmware-tools-distrib/
./vmware-install.pl
reboot
# After the reboot the shared folder is available under /mnt/hgfs
cd /mnt/hgfs/maven
The maven directory here is mapped to a directory on the host machine through VMware shared folders.
[root@localhost maven]# ll -a
total 3
drwxrwxrwx. 1 root root 0 Dec 28 2012 .
dr-xr-xr-x. 1 root root 4192 Mar 7 22:41 ..
drwxrwxrwx. 1 root root 0 Dec 28 2012 .m2