You can get what you want in this blog: games, movies, software, templates, applications, e-books, and more.

Saturday, May 9, 2009

Facebook History

History
http://en.wikipedia.org/wiki/FaceBook

The advent of Facebook came about as a spin-off of a Harvard University version of Hot or Not called Facemash.[9] Mark Zuckerberg, while attending Harvard as a sophomore, concocted Facemash on October 28, 2003. Zuckerberg was blogging about a girl and trying to think of something to do to get her off his mind.[10]


Image: The Facebook on February 12, 2004

According to The Harvard Crimson, Facemash "used photos compiled from the online facebooks of nine Houses, placing two next to each other at a time and asking users to choose the 'hotter' person." To accomplish this, Zuckerberg hacked into the protected areas of Harvard's computer network and copied the houses' private dormitory ID images. "Perhaps Harvard will squelch it for legal reasons without realizing its value as a venture that could possibly be expanded to other schools (maybe even ones with good-looking people ... )," Zuckerberg wrote in his personal blog. "But one thing is certain, and it's that I'm a jerk for making this site. Oh well. Someone had to do it eventually ... "[11] The site was quickly forwarded to several campus group list-servers but was shut down a few days later by the Harvard administration. Zuckerberg was charged by the administration with breach of security, violating copyrights and violating individual privacy, and faced expulsion, but ultimately the charges were dropped.[12]

The following semester, Zuckerberg founded "The Facebook", originally located at thefacebook.com, on February 4, 2004.[13] "Everyone's been talking a lot about a universal face book within Harvard," Zuckerberg told The Harvard Crimson. "I think it's kind of silly that it would take the University a couple of years to get around to it. I can do it better than they can, and I can do it in a week."[14] Membership was initially restricted to students of Harvard College, and within the first month, more than half the undergraduate population at Harvard was registered on the service.[15] Eduardo Saverin (business aspects), Dustin Moskovitz (programmer), Andrew McCollum (graphic artist), and Chris Hughes soon joined Zuckerberg to help promote the website.

In March 2004, Facebook expanded to Stanford, Columbia, and Yale.[16] This expansion continued when it opened to all Ivy League and Boston-area schools, and gradually to most universities in Canada and the United States.[17] In June 2004, Facebook moved its base of operations to Palo Alto, California.[16] The company dropped "The" from its name after purchasing the domain name facebook.com in 2005 for $200,000.[18] Facebook launched a high school version in September 2005, which Zuckerberg called the next logical step.[19] At that time, high school networks required an invitation to join.[20] Facebook later expanded membership eligibility to employees of several companies, including Apple Inc. and Microsoft.[21] Facebook was then opened on September 26, 2006 to everyone aged 13 and older with a valid e-mail address.[22][23] In October 2008, Facebook announced that it was to set up its international headquarters in Dublin, Ireland.[24]

UNCOVERING THE INTELLECTUAL CORE OF THE INFORMATION SYSTEMS DISCIPLINE

By Anna Sidorova, Nicholas Evangelopoulos, Joseph S. Valacich, Thiagarajan Ramakrishnan (MIS Quarterly, 2008)


This study uses latent semantic analysis to examine a large body of published IS research in order to address this question. Specifically, the abstracts of all research papers over the time period from 1985 through 2006 published in three top IS research journals (MIS Quarterly, Information Systems Research, and Journal of Management Information Systems) were analyzed. This analysis identified five core research areas: (1) information technology and organizations; (2) IS development; (3) IT and individuals; (4) IT and markets; and (5) IT and groups. Over the time frame of our analysis, these core topics have remained quite stable. However, the specific research themes within each core area have evolved significantly, reflecting research that has focused less on technology development and more on the social context in which information technologies are designed and used. As such, this analysis demonstrates that the information systems academic discipline has maintained a relatively stable research identity that focuses on how IT systems are developed and how individuals, groups, organizations, and markets interact with IT.

Tuesday, January 13, 2009

Meta Analysis

Meta-Analysis of Correlations
By Dr. Andy Field

Meta-analysis is a statistical technique by which information from independent studies is assimilated. Traditionally, social science literatures were assimilated through discursive reviews. However, such reviews are subjective and prone to reviewer biases such as the selective inclusion of studies, selective weighting of certain studies, and misrepresentation of findings (see Wolf, 1986). The inability of the human mind to provide accurate, unbiased, reliable and valid summaries of research created the need to develop more objective methods. Meta-analysis arguably provides the first step to such objectivity (see Schmidt, 1992), although it too relies on subjective judgments regarding study inclusion (and so is still problematic because of biased selection of studies and the omission of unpublished data, the so-called file-drawer problem). Since the seminal contributions of Glass (1976), Hedges and Olkin (1985), Rosenthal and Rubin (1978) and Hunter, Schmidt and Jackson (1982), there has been a meteoric increase in the use of meta-analysis. Field (2001) reports that over 2,200 articles using or discussing meta-analysis were published between 1981 and 2000. Of these, over 1,400 have been published since 1995 and over 400 in the past year. Clearly, the use of meta-analysis is still accelerating.

Basic Principles

To summarise, an effect size refers to the magnitude of the effect observed in a study, be that the size of a relationship between variables or the degree of difference between group means. There are many different metrics that can be used to measure effect size: the Pearson product-moment correlation coefficient, r; the effect-size index, d; as well as odds ratios, risk rates, and risk differences. Of these, the correlation coefficient is used most often (Law, Schmidt, & Hunter, 1994) and so is the focus of this study. Although various theorists have proposed variations on these metrics (for example, Glass's delta, Cohen's d, and Hedges's g are all estimates of the same population standardized mean difference), conceptually each metric represents the same thing: a standardized form of the size of the observed effect. Whether correlation coefficients or measures of differences are calculated is irrelevant because either metric can be converted into the other, and statistical analysis procedures for different metrics differ only in how the standard errors and bias corrections are calculated (Hedges, 1992).
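To make the interchangeability of the metrics concrete, here is a minimal Python sketch (not taken from Field's paper) of the standard conversions between d and r, assuming two groups of roughly equal size:

import math

def d_to_r(d):
    """Convert a standardized mean difference (d) to a correlation (r),
    assuming two groups of roughly equal size."""
    return d / math.sqrt(d ** 2 + 4)

def r_to_d(r):
    """Convert a correlation (r) back to a standardized mean difference (d)."""
    return 2 * r / math.sqrt(1 - r ** 2)

# Example: d = 0.8 corresponds to r of about 0.37, and converting back recovers d.
print(d_to_r(0.8))    # ~0.371
print(r_to_d(0.371))  # ~0.8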

In meta-analysis, the basic principle is to calculate effect sizes for individual studies, convert them to a common metric, and then combine them to obtain an average effect size. Studies in a meta-analysis are typically weighted by the accuracy of the effect size they provide (i.e. their sampling precision), which is achieved by using the sample size (or a function of it) as a weight. Once the mean effect size has been calculated, it can be expressed in terms of standard normal deviations (a Z score) by dividing it by the standard error of the mean. A significance value (i.e. the probability, p, of obtaining a Z score of such magnitude by chance) can then be computed. Alternatively, the significance of the average effect size can be inferred from the boundaries of a confidence interval constructed around the mean effect size.
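As an illustration of these steps (a generic sketch, not code from Field's paper), here is a minimal fixed-effects meta-analysis of correlations in Python, using Fisher's z transformation and sample-size-based weights (n - 3), in the spirit of the Hedges and Olkin (1985) approach:

import math

def fixed_effects_meta(correlations, sample_sizes):
    """Combine correlations from independent studies into a weighted
    average effect size, with a Z score and a 95% confidence interval."""
    # 1. Convert each r to Fisher's z (the common metric).
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in correlations]
    # 2. Weight each study by its precision: w = n - 3 for Fisher's z.
    ws = [n - 3 for n in sample_sizes]
    # 3. Weighted mean effect size and its standard error.
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = 1 / math.sqrt(sum(ws))
    # 4. Z score and 95% confidence interval, back-transformed to r.
    z_score = z_bar / se
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    back = lambda z: (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)
    return back(z_bar), z_score, (back(lo), back(hi))

# Example with three hypothetical studies (made-up correlations and sample sizes).
r_mean, z, ci = fixed_effects_meta([0.30, 0.45, 0.25], [40, 90, 60])
print(r_mean, z, ci)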

Johnson, Mullen and Salas (1995) point out that meta-analysis is typically used to address three general issues: central tendency, variability and prediction. Central tendency relates to the need to find the expected maginitude of effect across many studies (from which the population effect size can be inferred). this need is met by using some variation on the average effect size, the sinificance of the this average or the confidence interval around the average. The issue of variability pertains to difference between effect sizes acrosss studies and is generally adressed using tests of the homogenity of effect sizes. The question of prediction relates to the need to explain the variability in effect size across studies in terms of moderator variables. This issue is usually addressed by comparing study outcomes as a function of differences in characteristics that vary over all studies. As an example, differences in effect sizes could be moderated by the fact that some studies were carried out in the USA whereas others were conducted in the UK.
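The variability question is commonly addressed with a Q statistic: a weighted sum of squared deviations of each study's effect size from the mean, which under homogeneity follows a chi-square distribution with k - 1 degrees of freedom. A minimal sketch (again generic, not taken from the paper), reusing the Fisher-z quantities from the earlier example:

import math

def homogeneity_q(correlations, sample_sizes):
    """Homogeneity test for correlations in the Hedges-Olkin style:
    Q = sum of w_i * (z_i - z_bar)^2, compared against a chi-square
    distribution with k - 1 degrees of freedom."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in correlations]
    ws = [n - 3 for n in sample_sizes]
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - z_bar) ** 2 for w, z in zip(ws, zs))
    df = len(correlations) - 1
    return q, df  # compare q with the chi-square critical value for df

q, df = homogeneity_q([0.30, 0.45, 0.25], [40, 90, 60])
print(q, df)  # a large q relative to df suggests heterogeneous effect sizes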

Saturday, January 10, 2009

My GIS Project

Land Use Map (Peta Penggunaan Lahan)



Note: This is just an example (in jpg format)

Saturday, January 3, 2009

GIS

Introduction to ArcGIS

By Kardi Teknomo, PhD.

GIS is an abbreviation of Geographic Information System. Aside from the buzzwords and hype about it, GIS in very simple terms is just a combination of maps and a database. In other words, GIS is a spatial database.

A map is a very useful geographical tool to show the location of places. You may have learned about the world map with many countries in it.

You may also recognize a city map with district boundaries, roads, and railways.

For those of you who do not even know what a database is, you can imagine a database as a set of relational tables.

A database is more than a simple table because you can run queries on it (search for data) and create relationships between one table and other tables.

GIS is a combination of maps and a database. Depending on how we represent the map, every element in the map may have an association or connection with some database table. For example, selecting a country in the world map that has an association with a table will show all the attribute data related to that selection.
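To make this idea of linking map features to an attribute table concrete, here is a tiny Python sketch (this is not ArcGIS code; the feature names and attribute values are invented placeholders for illustration):

# A tiny "map layer": each feature has an ID, a name, and a geometry.
countries_layer = [
    {"id": "ID", "name": "Indonesia", "geometry": "polygon..."},
    {"id": "MY", "name": "Malaysia", "geometry": "polygon..."},
]

# A separate attribute table, keyed by the same ID (the join field).
# The values here are placeholders, not real statistics.
attributes = {
    "ID": {"population": 200_000_000, "capital": "Jakarta"},
    "MY": {"population": 25_000_000, "capital": "Kuala Lumpur"},
}

def select_feature(layer, name):
    """Simulate clicking a country on the map: find the feature,
    then look up its row in the attribute table via the shared ID."""
    for feature in layer:
        if feature["name"] == name:
            return feature, attributes.get(feature["id"], {})
    return None, {}

feature, attrs = select_feature(countries_layer, "Indonesia")
print(feature["name"], attrs)  # shows the attributes linked to the selected feature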

Maps in GIS can be separated into several layers. Each map layer can be viewed in combination with other layers, or it can be seen on its own. You may extract information based on these layers. For instance, you may search for a location based on several criteria, find a pattern across maps from several years, query features that are near some other feature, and so on. Later in this series of GIS tutorials, you will learn more about query, analysis, and modeling, and how to enhance decision support systems using the power of GIS.

My choice of ArcGIS (I am using version 8; version 9 may be slightly different in its menus) is simply because of its availability and my own familiarity with this software. Even if you do not have ArcGIS, you may browse this tutorial to gain an understanding of what you can do using GIS. I also provide a list of other GIS software, both commercial and free versions, on the GIS Resources page.

ArcGIS is a product of ESRI. The latest version is available in three different editions under three different names:
-ArcView (simple; Standard edition)
-ArcEditor (functional; Professional edition)
-ArcInfo (advanced; Enterprise edition)

Do not be confused by the names of the editions. The three editions share the same architecture and are used in the same way; the differences are in price and in the number of tools and functions.

If you open the menu in ArcGIS, you may see many icons. The first question is, perhaps, which one to use?
Regardless of your edition, ArcGIS contains at least three main applications:
-ArcCatalog
-ArcMap
-ArcToolbox

ArcCatalog is similar to Windows Explorer. With ArcCatalog you may preview your GIS data in terms of geography (a map) or as a table. If your data contains a 3D scene, you can use ArcCatalog to preview the content. You can create files, folders, and metadata for your GIS data using this application. If your data is a map file, you can double-click it to open it in ArcMap; otherwise, you can drag and drop your GIS data into ArcMap.

ArcMap is the main application that you use for GIS. Using ArcMap you can create and edit your maps, layers, and charts. You may also perform analyses and queries on your GIS data using ArcMap.

ArcToolbox is a collection of Analysis Tools (e.g. extract, overlay, proximity, statistics, surface cut and fill, volume and contour), Conversion Tools (i.e. importing from and exporting to different types of files), and Data Management Tools (e.g. joining tables, defining projections, etc.).

If your ArcGIS installation contains ArcScene, it is useful for showing your map in three dimensions. ArcScene is similar to ArcMap with more functionality for 3D scene analysis, but you cannot create maps in ArcScene.

The rest of this tutorial will cover only ArcCatalog and ArcMap, since they are the main applications of ArcGIS.

Monday, December 22, 2008

Build an Executive Information System

Microsoft Office Excel 2007 Visual Basic for Applications Step by Step
by Reed Jacobson
Microsoft Press 2007 (384 pages)
ISBN:9780735624023

Experience learning made easy, and quickly teach yourself Microsoft Office Excel 2007 VBA--one step at a time! With help from this friendly book, you'll learn to automate spreadsheets, write functions and procedures, customize menus, toolbars, and more!
Table of Contents
Microsoft Office Excel 2007 Visual Basic for Applications Step by Step
Features and Conventions of This Book
Using the Book’s CD
Getting Help
Chapter 1 - Make a Macro Do Simple Tasks
Chapter 2 - Make a Macro Do Complex Tasks
Chapter 3 - Explore Workbooks and Worksheets
Chapter 4 - Explore Range Objects
Chapter 5 - Explore Data Objects
Chapter 6 - Explore Graphical Objects
Chapter 7 - Control Visual Basic
Chapter 8 - Extend Excel and Visual Basic
Chapter 9 - Launch Macros with Events
Chapter 10 - Use Dialog Box Controls on a Worksheet
Chapter 11 - Create a Custom Form
Appendix A - Complete Enterprise Information System
Index
List of Figures
List of Sidebars
CD Content

This is an interesting book, he..he... Build your executive information system with Visual Basic in Excel now.

Tuesday, December 16, 2008

Simulation and Monte Carlo: With applications in finance and MCMC

Simulation and Monte Carlo: With applications in finance and MCMC
By LSC forum
Paperback: 348 pages
Date: March 23, 2007
Format: PDF

Description: Simulation and Monte Carlo is aimed at students studying for degrees in Mathematics, Statistics, Financial Mathematics, Operational Research, Computer Science, and allied subjects, who want an up-to-date account of the theory and practice of Simulation. Its distinguishing features are in-depth accounts of the theory of Simulation, including the important topic of variance reduction techniques, together with illustrative applications in Financial Mathematics, Markov chain Monte Carlo, and Discrete Event Simulation.
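As a flavour of the variance reduction techniques mentioned above (the book's own worksheets are in Maple; this is just an illustrative Python sketch, not material from the text), here is a toy Monte Carlo estimate of E[exp(U)] for U ~ Uniform(0,1), with and without antithetic variates:

import random
import math
import statistics

def plain_monte_carlo(n):
    """Crude Monte Carlo estimate of E[exp(U)], U ~ Uniform(0, 1)."""
    samples = [math.exp(random.random()) for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples) / math.sqrt(n)

def antithetic_monte_carlo(n):
    """Same estimate using antithetic variates: pair each U with 1 - U.
    Because exp is monotone, the paired values are negatively correlated,
    which reduces the variance of the estimator."""
    pairs = []
    for _ in range(n // 2):
        u = random.random()
        pairs.append(0.5 * (math.exp(u) + math.exp(1 - u)))
    return statistics.mean(pairs), statistics.stdev(pairs) / math.sqrt(n // 2)

# True value is e - 1 (about 1.718); the antithetic standard error is much smaller.
print(plain_monte_carlo(10_000))
print(antithetic_monte_carlo(10_000))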

Each chapter contains a good selection of exercises and solutions with an accompanying appendix comprising a Maple worksheet containing simulation procedures. The worksheets can also be downloaded from the web site supporting the book. This encourages readers to adopt a hands-on approach in the effective design of simulation experiments.

Arising from a course taught at Edinburgh University over several years, the book will also appeal to practitioners working in the finance industry, statistics and operations research.

Click Download

FREE keyword suggestion tool

Enter a starting keyword to generate up to 100 related keywords and an estimate of daily internet search volume. Then see the results. For more information, click this link: free keyword suggestion tool from Wordtracker.


Have a nice day

The term "Sistem Informasi Keperilakuan" is firstly pointed and popularated by Jogiyanto Hartono Mustakini in his book "Sistem Informasi Keperilakuan" (2007), Publisher Andi Offset Yogyakarta Indonesia.

Please give us any suggestions, criticism, or anything else relevant to this blog that can improve its quality. Thanks!

Note: For the best appearance, use Opera or IE as your browser.

Alfitman; Pamenan Mato Nan Hilang; Ikhlas Hati

About Me

Padang, West Sumatra, Indonesia
I wish I can do my best in life.