Sunday, April 18, 2010

host

host computer (host)
1. A computer that is attached to a network and provides services other than simply acting as a store-and-forward processor or communication switch. Host computers range in size from small microcomputers to large time-sharing or batch mainframes. Many networks have a hierarchical structure, with a communication subnetwork providing packet-switching services for host computers to support time-sharing, remote job entry, etc. A host computer at one level of a hierarchy may function as a packet or message switch at another.
2. A computer used to develop software for execution on another computer, known as the target computer.

3. A computer used to emulate another computer, known as the target computer. See also emulation.

about microsoft

Microsoft Corporation (NASDAQ: MSFT, HKEX: 4338) is a multinational computer technology corporation that develops, manufactures, licenses, and supports a wide range of software products for computing devices. Headquartered in Redmond, Washington, USA, its most profitable products are the Microsoft Windows operating system and the Microsoft Office suite of productivity software. As of the third quarter of 2009, Microsoft was ranked as the third largest company in the world, following PetroChina and ExxonMobil.
The company was founded on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. Microsoft rose to dominate the home computer operating system market with MS-DOS in the mid-1980s, followed by the Windows line of operating systems. Many of its products have achieved near-ubiquity in the desktop computer market. One commentator notes that Microsoft's original mission was "a computer on every desk and in every home, running Microsoft software." Microsoft possesses footholds in other markets, with assets such as the MSNBC cable television network and the MSN Internet portal. The company also markets both computer hardware products such as the Microsoft mouse and the Microsoft Natural keyboard, as well as home entertainment products such as the Xbox, Xbox 360, Zune and MSN TV. The company's initial public stock offering (IPO) was in 1986; the ensuing rise of the company's stock price has made four billionaires and an estimated 12,000 millionaires from Microsoft employees.

Throughout its history the company has been the target of criticism, including monopolistic business practices and anti-competitive strategies including refusal to deal and tying. The U.S. Department of Justice and the European Commission, among others, have ruled against Microsoft for antitrust violations. (See also United States v. Microsoft, European Union Microsoft competition case.)

'c' language

The computing world has undergone a revolution since the publication of The C Programming Language in 1978. Big computers are much bigger, and personal computers have capabilities that rival mainframes of a decade ago. During this time, C has changed too, although only modestly, and it has spread far beyond its origins as the language of the UNIX operating system.
The growing popularity of C, the changes in the language over the years, and the creation of compilers by groups not involved in its design, combined to demonstrate a need for a more precise and more contemporary definition of the language than the first edition of this book provided. In 1983, the American National Standards Institute (ANSI) established a committee whose goal was to produce "an unambiguous and machine-independent definition of the language C", while still retaining its spirit. The result is the ANSI standard for C.
The standard formalizes constructions that were hinted but not described in the first edition, particularly structure assignment and enumerations. It provides a new form of function declaration that permits cross-checking of definition with use. It specifies a standard library, with an extensive set of functions for performing input and output, memory management, string manipulation, and similar tasks. It makes precise the behavior of features that were not spelled out in the original definition, and at the same time states explicitly which aspects of the language remain machine-dependent.
This Second Edition of The C Programming Language describes C as defined by the ANSI standard. Although we have noted the places where the language has evolved, we have chosen to write exclusively in the new form. For the most part, this makes no significant difference; the most visible change is the new form of function declaration and definition. Modern compilers already support most features of the standard.
We have tried to retain the brevity of the first edition. C is not a big language, and it is not well served by a big book. We have improved the exposition of critical features, such as pointers, that are central to C programming. We have refined the original examples, and have added new examples in several chapters. For instance, the treatment of complicated declarations is augmented by programs that convert declarations into words and vice versa. As before, all examples have been tested directly from the text, which is in machine-readable form.
Appendix A, the reference manual, is not the standard, but our attempt to convey the essentials of the standard in a smaller space. It is meant for easy comprehension by programmers, but not as a definition for compiler writers -- that role properly belongs to the standard itself. Appendix B is a summary of the facilities of the standard library. It too is meant for reference by programmers, not implementers. Appendix C is a concise summary of the changes from the original version.
As we said in the preface to the first edition, C "wears well as one's experience with it grows". With a decade more experience, we still feel that way. We hope that this book will help you learn C and use it well.
We are deeply indebted to friends who helped us to produce this second edition. Jon Bentley, Doug Gwyn, Doug McIlroy, Peter Nelson, and Rob Pike gave us perceptive comments on almost every page of draft manuscripts. We are grateful for careful reading by Al Aho, Dennis Allison, Joe Campbell, G.R. Emlin, Karen Fortgang, Allen Holub, Andrew Hume, Dave Kristol, John Linderman, Dave Prosser, Gene Spafford, and Chris van Wyk. We also received helpful suggestions from Bill Cheswick, Mark Kernighan, Andy Koenig, Robin Lake, Tom London, Jim Reeds, Clovis Tondo, and Peter Weinberger. Dave Prosser answered many detailed questions about the ANSI standard. We used Bjarne Stroustrup's C++ translator extensively for local testing of our programs, and Dave Kristol provided us with an ANSI C compiler for final testing. Rich Drechsler helped greatly with typesetting.
computer science the complete reference
operating system concepts
computer virus working

Java

J2EE is yet another acronym in the world of computing. This one stands for Java 2 Platform, Enterprise Edition. Its significance will become clear once we trace its lineage. First of all, Java is a programming language developed by Sun Microsystems, one of the giants of the industry. The Java Platform includes a virtual machine, a software stand-in for a processor that executes compiled Java instructions (bytecode).

The Java language is such that it allows cross-platform communication between multiple kinds of devices. For example, a programmer can develop Java code on a desktop computer and expect it to run on other computers, routers, and even mobile phones, as long as those devices are Java-enabled. This portability is described by the Sun acronym WORA, which stands for "Write once, run anywhere." A large number of mainframes, computers, mobile phones, and other electronic devices operate using the Java Platform.

The 2 in the acronym J2EE stands for Version 2 of the Java Platform. The number 2 is often dropped nowadays, so J2EE becomes Java EE, but the traditional name J2EE is still in wide use.

Now, on to the EE. It stands for Enterprise Edition, which is the most powerful form of the Java Platform. Sun has created three editions so far. The most compact is the Micro Edition, which is used for mobile phones and PDAs. Following form, this can be abbreviated as Java ME.

The middle edition is the Standard Edition, which can run on mobile devices, laptops and desktop computers. The abbreviated name of this edition is Java SE. Building our way up the pyramid, we come at last to the Enterprise Edition, which includes all the functionality of the Standard Edition and also features routines and subroutines designed specifically for servers and mainframes.

One prime benefit of J2EE, powerful as the platform is, is that it is available for free. You can download it right now from the Sun Microsystems website. Third-party open-source tools are available to help you as well, including Apache Tomcat and JBoss. Unless you are running your own multiple-workstation server system or mainframe, however, you are unlikely to encounter or have a need for J2EE. Still, it's good to know what such things stand for.

animation

Animation is the rapid display of a sequence of images of 2-D or 3-D artwork or model positions in order to create an illusion of movement. It is an optical illusion of motion due to the phenomenon of persistence of vision, and can be created and demonstrated in a number of ways. The most common method of presenting animation is as a motion picture or video program, although several other forms of presenting animation also exist.
2D animation figures are created and/or edited on the computer using 2D bitmap graphics, or created and edited using 2D vector graphics. This includes automated computerized versions of traditional animation techniques such as tweening, morphing, onion skinning and interpolated rotoscoping.
In 3D animation, figures are digitally modeled and manipulated by an animator. In order to manipulate a mesh, it is given a digital skeletal structure that can be used to control it; this process is called rigging. Various other techniques can be applied, such as mathematical functions (e.g., gravity, particle simulations), simulated fur or hair, effects such as fire and water, and motion capture, to name but a few; these techniques fall under the category of 3D dynamics. Many 3D animations are very believable and are commonly used as visual effects for recent movies.
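To make tweening concrete, here is a minimal C sketch (the keyframe coordinates and frame count are invented for illustration) that computes the in-between positions of a point moving between two keyframes:

#include <stdio.h>

/* A 2D keyframe position. */
struct point { float x, y; };

/* Linear interpolation ("tweening") between keyframes a and b;
   t runs from 0.0 (at keyframe a) to 1.0 (at keyframe b). */
struct point tween(struct point a, struct point b, float t)
{
    struct point p;
    p.x = a.x + (b.x - a.x) * t;
    p.y = a.y + (b.y - a.y) * t;
    return p;
}

int main()
{
    struct point start = { 0.0f, 0.0f }, end = { 100.0f, 50.0f };
    int frame, frames = 5; /* number of in-between frames to generate */

    for (frame = 0; frame <= frames; frame++)
    {
        struct point p = tween(start, end, (float)frame / frames);
        printf("frame %d: (%.1f, %.1f)\n", frame, p.x, p.y);
    }
    return 0;
}

Real animation packages apply the same idea to every animated attribute (position, rotation, color), usually with easing curves rather than a straight line.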

wide area network

WAN - Wide Area Network
As the term implies, a WAN spans a large physical distance. The Internet is the largest WAN, spanning the Earth.

A WAN is a geographically-dispersed collection of LANs. A network device called a router connects LANs to a WAN. In IP networking, the router maintains both a LAN address and a WAN address.

A WAN differs from a LAN in several important ways. Most WANs (like the Internet) are not owned by any one organization but rather exist under collective or distributed ownership and management. WANs tend to use technology like ATM, Frame Relay and X.25 for connectivity over the longer distances.
LAN, WAN and Home Networking
Residences typically employ one LAN and connect to the Internet WAN via an Internet Service Provider (ISP) using a broadband modem. The ISP provides a WAN IP address to the modem, and all of the computers on the home network use LAN (so-called private) IP addresses. All computers on the home LAN can communicate directly with each other but must go through a central gateway, typically a broadband router, to reach the ISP.
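As a small illustration of the "private" addresses mentioned above, the C sketch below (the test addresses are arbitrary examples) checks whether a dotted-quad IPv4 address falls in one of the RFC 1918 ranges reserved for LANs:

#include <stdio.h>

/* Returns 1 if the IPv4 address a.b.c.d lies in one of the RFC 1918
   private ranges used on home LANs, and 0 if it is a public address. */
int is_private(int a, int b, int c, int d)
{
    if (a == 10)                        /* 10.0.0.0/8     */
        return 1;
    if (a == 172 && b >= 16 && b <= 31) /* 172.16.0.0/12  */
        return 1;
    if (a == 192 && b == 168)           /* 192.168.0.0/16 */
        return 1;
    return 0;
}

int main()
{
    printf("%d\n", is_private(192, 168, 1, 10)); /* prints 1: LAN address    */
    printf("%d\n", is_private(8, 8, 8, 8));      /* prints 0: public address */
    return 0;
}

Traffic from such addresses cannot be routed on the Internet directly, which is why the broadband router translates them to the single WAN address assigned by the ISP.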

data structure programming

/*Program of sorting using address calculation sort*/
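/* How it works: large_dig() finds the number of digits k in the
   largest element; hash_fn() then extracts each number's most
   significant digit (0-9), which selects one of ten linked lists
   head[0..9]. insert() keeps each list sorted as elements arrive,
   and finally the lists are concatenated in order 0..9 back into
   the array, leaving it sorted. */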
#include<stdio.h>
#include<stdlib.h> /* malloc() lives here in standard C */
#define MAX 20

/* Function prototypes so modern compilers can check the calls below */
void addr_sort();
void insert(int num,int addr);
int large_dig();
int hash_fn(int number,int k);

struct node
{
int info ;
struct node *link;
};
struct node *head[10];
int n,i,arr[MAX];
int main()
{

int i;
printf("Enter the number of elements in the list : ");
scanf("%d", &n);
for(i=0;i<n;i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&arr[i]);
}/*End of for */

printf("Unsorted list is :\n");
for(i=0;i<n;i++)
printf("%d ",arr[i]);
printf("\n");
addr_sort();
printf("Sorted list is :\n");
for(i=0;i<n;i++)
printf("%d ",arr[i]);
printf("\n");
}/*End of main()*/

void addr_sort()
{
int i,k,dig;
struct node *p;
int addr;
k=large_dig();
for(i=0;i<=9;i++)
head[i]=NULL;
for(i=0;i<n;i++)
{
addr=hash_fn( arr[i],k );
insert(arr[i],addr);
}

for(i=0; i<=9 ; i++)
{
printf("head(%d) -> ",i);
p=head[i];
while(p!=NULL)
{
printf("%d ",p->info);
p=p->link;
}
printf("\n");
}

/*Taking the elements of linked lists in array*/
i=0;
for(k=0;k<=9;k++)
{
p=head[k];
while(p!=NULL)
{
arr[i++]=p->info;
p=p->link;
}
}
}/*End of addr_sort()*/

/*Inserts the number in sorted linked list*/
void insert(int num,int addr)
{
struct node *q,*tmp;
tmp= malloc(sizeof(struct node));
tmp->info=num;
/*list empty or item to be added in begining */
if(head[addr] == NULL || num < head[addr]->info)
{
tmp->link=head[addr];
head[addr]=tmp;
return;
}
else
{
q=head[addr];
while(q->link != NULL && q->link->info < num)
q=q->link;
tmp->link=q->link;
q->link=tmp;
}
}/*End of insert()*/

/* Finds number of digits in the largest element of the list */
int large_dig()
{

int large = 0,ndig = 0 ;

for(i=0;i<n;i++)
{
if(arr[i] > large)
large = arr[i];
}
printf("Largest Element is %d , ",large);
while(large != 0)
{
ndig++;
large = large/10 ;
}

printf("Number of digits in it are %d\n",ndig);
return(ndig);
} /*End of large_dig()*/

int hash_fn(int number,int k)
{
/*Find kth digit of the number*/
int digit=0,addr,i;
for(i = 1 ; i <=k ; i++)
{
digit = number % 10 ;
number = number /10 ;
}
addr=digit;
return(addr);
}/*End of hash_fn()*/

/* Program of sorting using bubble sort */
#include <stdio.h>
#define MAX 20
int main()
{
int arr[MAX],i,j,k,temp,n,xchanges;
printf("Enter the number of elements : ");
scanf("%d",&n);
for (i = 0; i < n; i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&arr[i]);
}
printf("Unsorted list is :\n");
for (i = 0; i < n; i++)
printf("%d ", arr[i]);
printf("\n");
/* Bubble sort*/

for (i = 0; i < n-1 ; i++)
{
xchanges=0;
for (j = 0; j <n-1-i; j++)
{
if (arr[j] > arr[j+1])
{
temp = arr[j];
arr[j] = arr[j+1];
arr[j+1] = temp;
xchanges++;
}/*End of if*/
}/*End of inner for loop*/

if(xchanges==0) /*If list is sorted*/
break;
printf("After Pass %d elements are : ",i+1);
for (k = 0; k < n; k++)
printf("%d ", arr[k]);
printf("\n");
}/*End of outer for loop*/

printf("Sorted list is :\n");
for (i = 0; i < n; i++)
printf("%d ", arr[i]);
printf("\n");
}/*End of main()*/

/* Program of sorting through heapsort*/
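/* How it works: create_heap() builds a max-heap in arr[] by sifting
   each element up toward the root as it is inserted. heap_sort()
   then repeatedly swaps the root (the current maximum) with the
   last unsorted element and calls del_root() to sift the new root
   down, restoring the heap over the shrinking unsorted region. */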
# include <stdio.h>

int arr[20],n;

/* Function prototypes so modern compilers can check the calls below */
void display();
void create_heap();
void insert(int num,int loc);
void heap_sort();
void del_root(int last);

int main()
{
int i;
printf("Enter number of elements : ");
scanf("%d",&n);
for(i=0;i<n;i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&arr[i]);
}
printf("Entered list is :\n");
display();

create_heap();

printf("Heap is :\n");
display();

heap_sort();
printf("Sorted list is :\n");
display();
}/*End of main()*/

void display()
{ int i;
for(i=0;i<n;i++)
printf("%d ",arr[i]);
printf("\n");
}
/*End of display()*/

void create_heap()
{
int i;
for(i=0;i<n;i++)
insert(arr[i],i);
}/*End of create_heap()*/

void insert(int num,int loc)
{
int par;
while(loc>0)
{
par=(loc-1)/2;
if(num<=arr[par])
{
arr[loc]=num;
return;
}
arr[loc]=arr[par];
loc=par;
}/*End of while*/
arr[0]=num;
}/*End of insert()*/


void heap_sort()
{
int last;
for(last=n-1; last>0; last--)
del_root(last);
}/*End of heap_sort()*/

void del_root(int last)
{
int left,right,i,temp;
i=0; /*Since every time we have to replace root with last*/
/*Exchange last element with the root */
temp=arr[i];
arr[i]=arr[last];
arr[last]=temp;

left=2*i+1; /*left child of root*/
right=2*i+2;/*right child of root*/

while( right < last)
{
if( arr[i]>=arr[left] && arr[i]>=arr[right] )
return;
if( arr[right]<=arr[left] )
{
temp=arr[i];
arr[i]=arr[left];
arr[left]=temp;
i=left;
}
else
{
temp=arr[i];
arr[i]=arr[right];
arr[right]=temp;
i=right;
}
left=2*i+1;
right=2*i+2;
}/*End of while*/
if( left==last-1 && arr[i]<arr[left] )/*right==last*/
{
temp=arr[i];
arr[i]=arr[left];
arr[left]=temp;
}
}/*End of del_root*/

/* Program of sorting using insertion sort */
#include <stdio.h>
#define MAX 20

int main()
{
int arr[MAX],i,j,k,n;
printf("Enter the number of elements : ");
scanf("%d",&n);
for (i = 0; i < n; i++)
{
printf("Enter element %d : ",i+1);
scanf("%d", &arr[i]);
}
printf("Unsorted list is :\n");
for (i = 0; i < n; i++)
printf("%d ", arr[i]);
printf("\n");
/*Insertion sort*/
for(j=1;j<n;j++)
{
k=arr[j]; /*k is to be inserted at proper place*/
for(i=j-1;i>=0 && k<arr[i];i--)
arr[i+1]=arr[i];
arr[i+1]=k;
printf("Pass %d, Element inserted in proper place: %d\n",j,k);
for (i = 0; i < n; i++)
printf("%d ", arr[i]);
printf("\n");
}
printf("Sorted list is :\n");
for (i = 0; i < n; i++)
printf("%d ", arr[i]);
printf("\n");
}/*End of main()*/

/* Program of merging two sorted arrays into a third sorted array*/
#include<stdio.h>

int main()
{
int arr1[20],arr2[20],arr3[40];
int i,j,k;
int max1,max2;

printf("Enter the number of elements in list1 : ");
scanf("%d",&max1);
printf("Take the elements in sorted order :\n");
for(i=0;i<max1;i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&arr1[i]);
}
printf("Enter the number of elements in list2 : ");
scanf("%d",&max2);
printf("Take the elements in sorted order :\n");
for(i=0;i<max2;i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&arr2[i]);
}
/* Merging */
i=0; /*Index for first array*/
j=0; /*Index for second array*/
k=0; /*Index for merged array*/

while( (i < max1) && (j < max2) )
{
if( arr1[i] < arr2[j] )
arr3[k++]=arr1[i++];
else
arr3[k++]=arr2[j++];
}/*End of while*/
/*Put remaining elements of arr1 into arr3*/
while( i < max1 )
arr3[k++]=arr1[i++];
/*Put remaining elements of arr2 into arr3*/
while( j < max2 )
arr3[k++]=arr2[j++];

/*Merging completed*/
printf("List 1 : ");
for(i=0;i<max1;i++)
printf("%d ",arr1[i]);
printf("\nList 2 : ");
for(i=0;i<max2;i++)
printf("%d ",arr2[i]);
printf("\nMerged list : ");
for(i=0;i<max1+max2;i++)
printf("%d ",arr3[i]);
printf("\n");
}/*End of main()*/

/*Program of sorting using quick sort through recursion*/
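/* How it works: quick() takes the first element of the sublist as
   the pivot, then alternately scans from the right for an element
   smaller than the pivot and from the left for one larger, swapping
   the pivot each time, until the pivot lands in its final position.
   The sublists on either side of it are then sorted recursively. */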
#include<stdio.h>
#define MAX 30

enum bool { FALSE,TRUE };

/* Function prototypes so modern compilers can check the calls below */
void quick(int arr[],int low,int up);
void display(int arr[],int low,int up);
int main()
{
int array[MAX],n,i;
printf("Enter the number of elements : ");
scanf("%d",&n);

for(i=0;i<n;i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&array[i]);
}

printf("Unsorted list is :\n");
display(array,0,n-1);
printf("\n");

quick(array,0,n-1);

printf("Sorted list is :\n");
display(array,0,n-1);
printf("\n");

}/*End of main() */

void quick(int arr[],int low,int up)
{
int piv,temp,left,right;
enum bool pivot_placed=FALSE;
left=low;
right=up;
piv=low; /*Take the first element of sublist as piv */

if(low>=up)
return;
printf("Sublist : ");
display(arr,low,up);

/*Loop till pivot is placed at proper place in the sublist*/
while(pivot_placed==FALSE)
{
/*Compare from right to left */
while( arr[piv]<=arr[right] && piv!=right )
right=right-1;
if( piv==right )
pivot_placed=TRUE;
if( arr[piv] > arr[right] )
{
temp=arr[piv];
arr[piv]=arr[right];
arr[right]=temp;
piv=right;
}
/*Compare from left to right */
while( arr[piv]>=arr[left] && left!=piv )
left=left+1;
if(piv==left)
pivot_placed=TRUE;
if( arr[piv] < arr[left] )
{
temp=arr[piv];
arr[piv]=arr[left];
arr[left]=temp;
piv=left;
}
}/*End of while */

printf("-> Pivot Placed is %d -> ",arr[piv]);
display(arr,low,up);
printf("\n");

quick(arr,low,piv-1);
quick(arr,piv+1,up);
}/*End of quick()*/
void display(int arr[],int low,int up)
{
int i;
for(i=low;i<=up;i++)
printf("%d ",arr[i]);
}

/*Program of sorting using radix sort*/
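/* How it works: this is a least-significant-digit radix sort on a
   linked list. On pass k the numbers are distributed into ten
   queues (front[]/rear[]) according to their kth digit from the
   right, and the queues are joined back into a single list. The
   distribution is stable, so after the pass on the most significant
   digit the list is completely sorted. */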
#include<stdio.h>
#include<stdlib.h> /* malloc() lives here in standard C */

struct node
{
int info ;
struct node *link;
}*start=NULL;

/* Function prototypes so modern compilers can check the calls below */
void display();
void radix_sort();
int large_dig();
int digit(int number,int k);

int main()
{
struct node *tmp,*q;
int i,n,item;

printf("Enter the number of elements in the list : ");
scanf("%d", &n);

for(i=0;i<n;i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&item);

/* Inserting elements in the linked list */
tmp= malloc(sizeof(struct node));
tmp->info=item;
tmp->link=NULL;

if(start==NULL) /* Inserting first element */
start=tmp;
else
{
q=start;
while(q->link!=NULL)
q=q->link;
q->link=tmp;
}
}/*End of for*/

printf("Unsorted list is :\n");
display();
radix_sort();
printf("Sorted list is :\n");
display ();
}/*End of main()*/

void display()
{
struct node *p=start;
while( p !=NULL)
{
printf("%d ", p->info);
p= p->link;
}
printf("\n");
}/*End of display()*/

void radix_sort()
{
int i,k,dig,maxdig,mindig,least_sig,most_sig;
struct node *p, *rear[10], *front[10];

least_sig=1;
most_sig=large_dig();

for(k = least_sig; k <= most_sig ; k++)
{
printf("PASS %d : Examining %dth digit from right ",k,k);
for(i = 0 ; i <= 9 ; i++)
{
rear[i] = NULL;
front[i] = NULL ;
}
maxdig=0;
mindig=9;
p = start ;
while( p != NULL)
{
/*Find kth digit in the number*/
dig = digit(p->info, k);
if(dig>maxdig)
maxdig=dig;
if(dig<mindig)
mindig=dig;

/*Add the number to queue of dig*/
if(front[dig] == NULL)
front[dig] = p ;
else
rear[dig]->link = p ;
rear[dig] = p ;
p=p->link;/*Go to next number in the list*/
}/*End while */
/* maxdig and mindig are the maximum and minimum
digits of the kth digits of all the numbers*/

printf("mindig=%d maxdig=%d\n",mindig,maxdig);
/*Join all the queues to form the new linked list*/
start=front[mindig];
for(i=mindig;i<maxdig;i++)
{
if(rear[i+1]!=NULL)
rear[i]->link=front[i+1];
else
rear[i+1]=rear[i];
}
rear[maxdig]->link=NULL;
printf("New list : ");
display();
}/* End for */

}/*End of radix_sort*/

/* This function finds number of digits in the largest element of the list */
int large_dig()
{
struct node *p=start ;
int large = 0,ndig = 0 ;

while(p != NULL)
{
if(p ->info > large)
large = p->info;
p = p->link ;
}
printf("Largest Element is %d , ",large);
while(large != 0)
{
ndig++;
large = large/10 ;
}

printf("Number of digits in it are %d\n",ndig);
return(ndig);
} /*End of large_dig()*/

/*This function returns kth digit of a number*/
int digit(int number, int k)
{
int digit, i ;
for(i = 1 ; i <=k ; i++)
{
digit = number % 10 ;
number = number /10 ;
}
return(digit);
}/*End of digit()*/

/*Program of sorting using selection sort*/
#include <stdio.h>
#define MAX 20

int main()
{
int arr[MAX], i,j,k,n,temp,smallest;
printf("Enter the number of elements : ");
scanf("%d",&n);
for (i = 0; i < n; i++)
{
printf("Enter element %d : ",i+1);
scanf("%d", &arr[i]);
}
printf("Unsorted list is : \n");
for (i = 0; i < n; i++)
printf("%d ", arr[i]);
printf("\n");
/*Selection sort*/
for(i = 0; i< n - 1 ; i++)
{
/*Find the smallest element*/
smallest = i;
for(k = i + 1; k < n ; k++)
{
if(arr[smallest] > arr[k])
smallest = k ;
}
if( i != smallest )
{
temp = arr [i];
arr[i] = arr[smallest];
arr[smallest] = temp ;
}
printf("After Pass %d elements are : ",i+1);
for (j = 0; j < n; j++)
printf("%d ", arr[j]);
printf("\n");
}/*End of for*/
printf("Sorted list is : \n");
for (i = 0; i < n; i++)
printf("%d ", arr[i]);
printf("\n");
}/*End of main()*/

/* Program of sorting using merge sort without recursion*/
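/* How it works: this is a bottom-up (non-recursive) merge sort.
   Sorted runs of size 1, 2, 4, ... are merged pairwise from arr[]
   into temp[] and copied back, doubling the run size on each pass
   until a single sorted run covers the whole array. */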
#include<stdio.h>
#define MAX 30

int main()
{
int arr[MAX],temp[MAX],i,j,k,n,size,l1,h1,l2,h2;

printf("Enter the number of elements : ");
scanf("%d",&n);
for(i=0;i<n;i++)
{
printf("Enter element %d : ",i+1);
scanf("%d",&arr[i]);
}
printf("Unsorted list is : ");
for( i = 0 ; i<n ; i++)
printf("%d ", arr[i]);

/*l1 lower bound of first pair and so on*/
for(size=1; size < n; size=size*2 )
{
l1=0;
k=0; /*Index for temp array*/
while( l1+size < n)
{
h1=l1+size-1;
l2=h1+1;
h2=l2+size-1;
if( h2>=n ) /* h2 exceeds the limlt of arr */
h2=n-1;
/*Merge the two pairs with lower limits l1 and l2*/
i=l1;
j=l2;
while(i<=h1 && j<=h2 )
{
if( arr[i] <= arr[j] )
temp[k++]=arr[i++];
else
temp[k++]=arr[j++];
}
while(i<=h1)
temp[k++]=arr[i++];
while(j<=h2)
temp[k++]=arr[j++];
/**Merging completed**/
l1=h2+1; /*Take the next two pairs for merging */
}
/*End of while*/

for(i=l1; k<n; i++) /*any pair left */
temp[k++]=arr[i];

for(i=0;i<n;i++)
arr[i]=temp[i];

printf("\nSize=%d \nElements are : ",size);
for( i = 0 ; i<n ; i++)
printf("%d ", arr[i]);
}
/*End of for loop */
printf("Sorted list is :\n");
for( i = 0 ; i<n ; i++)
printf("%d ", arr[i]);
printf("\n");
}/*End of main()*/

motherboard

What is a Motherboard?

A motherboard is also known as a main board, system board and logic board. A common abbreviation is 'mobo'. They can be found in a variety of electrical devices, ranging from a TV to a computer.

Generally, they will be referred to as a motherboard or a main board when associated with a complex device such as a computer, which is what we shall look at. Put simply, it is the central circuit board of your computer.

All other components and peripherals plug into it, and the job of the motherboard is to relay information between them all. A better motherboard will not by itself make your PC faster, but it is nonetheless important to have one that is both stable and reliable, as its role is vital.

A motherboard houses the BIOS (Basic Input/Output System), which is the simple software run by a computer when initially turned on. Other components attach directly to it, such as the memory, CPU (Central Processing Unit), graphics card, sound card, hard-drive, disk drives, along with various external ports and peripherals.

There are a lot of motherboards on the market to choose from. The big question is how to choose the one that is right for you. Different motherboards support different components, and so it is vital you make a number of decisions concerning general system specifications before you can pick the right motherboard.

A motherboard can come in many configurations to fit different needs and budgets. At its most basic, it comes with several interfaces for necessary components and a BIOS chip set to control setup of the motherboard. Many computer enthusiasts favor one type of BIOS over another and will choose a motherboard partially based on the BIOS manufacturer.

An equally important feature of the motherboard is the type of CPU it will support. Some motherboards support AMD CPUs, while others support Intel processors. If you purchase your case before other components, the first factor to think about in a motherboard is its size, or form factor. A form factor is a standardized motherboard size.

If you think about fitting a motherboard in a case, there are a number of mounting holes, slot locations and PSU connectors to match up. The most popular motherboard form factor today is ATX, which evolved from its predecessor, the Baby AT, a smaller version of the AT (Advanced Technology) form factor. Generally, today's computers have ATX form factor motherboards.


Chipsets are a crucial part of a motherboard - they control the system and its capabilities. Furthermore, a chipset supports the facilities offered by the processor. A chipset is part of the motherboard, and cannot be upgraded without upgrading the whole board. It is therefore important to make sure you choose the right one in the first place.

There are a few main producers of chipsets: AMD, Intel, NVIDIA and VIA. The latter two make chipsets for both AMD and Intel processors; AMD and Intel only make chipsets compatible with their own processors.

Another important consideration is the amount and type of RAM the motherboard will support. It is always best to buy a board that supports more RAM than currently needed. If a new technology for RAM chips is available, getting a board that supports the newer chips will help future-proof the investment.

The number of PCI slots varies from motherboard to motherboard, as do other interfaces such as the number of SATA ports, the RAID abilities, and the USB and FireWire ports. As mentioned earlier, sound and video capability might be built in, although enthusiasts generally prefer to disable onboard video and sound and add superior third-party cards.

Computer display is an important issue, as there are many kinds of graphics cards available these days. Graphics cards come in PCI, AGP and PCI Express varieties. These days, PCI Express is ruling the market, but one will find that AGP graphics cards are still in use.

One of the best things you can do when looking for a motherboard is to read lots of reviews. They will give you good information about how the board performs and what it is compatible with. Never make a judgement on one review alone and wherever possible ask for recommendations from other people.

Unless you have limitless resources, price is always a consideration when buying computer components. A motherboard usually takes up a fairly large part of any PC budget, so it requires careful consideration. It is worth bearing in mind that cheaper boards sometimes support only more expensive components. If this is the case, work out the total cost of buying the board and components, as sometimes it may be worth spending a little more on a more expensive board. A cheap motherboard may be unreliable and more trouble than it is worth. A motherboard is one of those components where it pays to spend a little extra.


computer history

In 1943 development begins in earnest on the Electronic Numerical Integrator And Computer (ENIAC) at the University of Pennsylvania. It is designed by John Mauchly and J. Presper Eckert of the Moore School, with help from John von Neumann and others. In 1944, the Harvard Mark I is introduced. Based on a series of proposals made by Howard Aiken in the late 1930's, the Mark I computes complex tables for the U.S. Navy. It uses a paper tape to store instructions, and Aiken hires Grace Hopper ("Amazing Grace") as one of three programmers working on the machine. Thomas J. Watson Sr. plays a pivotal role involving his company, IBM, in the machine's development.

Early in 1945, with the Mark I stopped for repairs, Hopper notices a moth in one of the relays, possibly causing the problem. From this day on, Hopper refers to fixing the system as "debugging". The same year Von Neumann proposes the concept of a "stored program" in a paper that is never officially published.

Work completes on ENIAC in 1946. Although only three years old, the machine is woefully behind on technology, but the inventors opt to continue while working on a more modern machine, the EDVAC. Programming ENIAC requires it to be rewired. A later version eliminates this problem. To make the machine appear more impressive to reporters during its unveiling, a team member (possibly Eckert) puts translucent spheres (halved ping pong balls) over the lights. The US patent office will later recognize this as the first computer.

The next year scientists employed by Bell Labs complete work on the transistor (John Bardeen, Walter Brattain and William Shockley receive the Nobel Prize in Physics in 1956), and by 1948 teams around the world work on a "stored program" machine. The first, nicknamed "Baby", is a prototype of a much larger machine under construction in Britain and is shown in June 1948.

The impetus over the next 5 years for advances in computers is mostly the government and military. UNIVAC, delivered in 1951 to the Census Bureau, results in a tremendous financial loss to its manufacturer, Remington-Rand. The next year Grace Hopper, now an employee of that company, proposes "reusable software," code segments that could be extracted and assembled according to instructions in a "higher level language." The concept of compiling is born. Hopper would revise this concept over the next twenty years and her ideas would become an integral part of all modern computers. CBS uses one of the 46 UNIVAC computers produced to predict the outcome of the 1952 Presidential Election. They do not air the prediction for 3 hours because they do not trust the machine.

[Image: small portion of the IBM 701, courtesy IBM]

IBM introduces the 701 the following year. It is the first commercially successful computer. In 1956 FORTRAN is introduced (proposed in 1954, it takes nearly 3 years to develop the compiler). Two additional languages, LISP and COBOL, are added in 1957 and 1958. Other early languages include ALGOL and BASIC. Although never widely used, ALGOL is the basis for many of today's languages.

With the introduction of Control Data's CDC 1604 in 1958, the first transistor-powered computer, a new age dawns. Brilliant scientist Seymour Cray heads the development team. This year integrated circuits are introduced by two men, Jack Kilby and Robert Noyce, working independently. The second network is developed at MIT. Over the next three years computers begin affecting the day-to-day lives of most Americans. The addition of MICR characters at the bottom of checks is common.

In 1961 Fairchild Semiconductor introduces the integrated circuit. Within ten years all computers use these instead of the transistor. Formerly building-sized computers are now room-sized, and are considerably more powerful. The following year the Atlas becomes operational, displaying many of the features that make today's systems so powerful, including virtual memory, pipelined instruction execution and paging. Designed at the University of Manchester, it benefits from contributions by some of the people who had developed Colossus two decades earlier.

On April 7, 1964, IBM introduces the System/360. While a technical marvel, the main feature of this machine is business oriented... IBM guarantees the "upward compatibility" of the system, reducing the risk that a business would invest in outdated technology. Dartmouth College, where the first network was demonstrated 25 years earlier, moves to the forefront of the "computer age" with the introduction of TSS (Time Share System), a crude (by today's standards) networking system. It is the first Wide Area Network. In three years Randy Golden, President and Founder of Golden Ink, would begin working on this network.

Within a year MIT returns to the top of the intellectual computer community with the introduction of a greatly refined network that features shared resources and uses the first minicomputer (DEC's PDP-8) to manage telephone lines. Bell Labs and GE play major roles in its design.

In 1969 Bell Labs, unhappy with the direction of the MIT project, leaves and develops its own operating system, UNIX. One of the many precursors to today's Internet, ARPANet, is quietly launched. Alan Kay, who will later become a designer for Apple, proposes the "personal computer." Also in 1969, unhappy with Fairchild Semiconductor, a group of technicians begin discussing forming their own company. This company, formed the next year, would be known as Intel. The movie Colossus: The Forbin Project has a supercomputer as the villain. The next year, The Computer Wore Tennis Shoes is the first feature-length movie with the word "computer" in the title. In 1971, Texas Instruments introduces the first "pocket calculator." It weighs 2.5 pounds.

With the country embroiled in a crisis of confidence known as Watergate, in 1973 a little publicized judicial decision takes the patent for the computer away from Mauchly and Eckert and awards it to Atanasoff. Xerox introduces the mouse. Proposals are made for the first local area networks.

In 1975 the first personal computer is marketed in kit form. The Altair features 256 bytes of memory. Bill Gates, with others, writes a BASIC interpreter for the machine. The next year Apple begins to market PC's, also in kit form. It includes a monitor and keyboard. The earliest RISC platforms become stable. In 1976, Queen Elizabeth goes on-line with the first royal email message.

During the next few years the personal computer explodes on the American scene. Microsoft, Apple and many smaller PC-related companies form (and some die). By 1977 stores begin to sell PC's. Continuing today, companies strive to reduce the size and price of PC's while increasing capacity. Entering the fray, IBM introduces its PC in 1981 (it's actually IBM's second attempt, but the first failed miserably). Time selects the computer as its Machine of the Year in 1982. Tron, a computer-generated special effects extravaganza, is released the same year.

see yourself

Computer technology keeps advancing at an amazing pace. Today's home computers have more memory, run faster and are relatively less expensive than computers from ten or even five years ago. Is there a way to measure how fast your computer is? You might think that you could tell simply from the processor clock frequency, but this doesn't tell the whole story. The clock frequency just tells how many operations per second the CPU can perform. In modern multi-tasking operating systems, the CPU's time is split between many programs running at once. Also, the central processor (CPU) sends data to and receives data from other subsystems on the computer (e.g., memory, disk drives, video display) which usually run at slower speeds. Modern processors are equipped with high-speed data buffers (called caches) to alleviate these bottlenecks. There are also many strategies for optimizing the order of operations in a program for greatest efficiency.

In this project you will use a Java applet (see Experimental Procedure, below) to measure how long it takes for the computer to perform arithmetic operations (addition, subtraction, multiplication and division) with various data types (integers, long integers, floating point numbers and double-precision floating point numbers).

Measuring how long an operation takes provides useful information, both for optimizing algorithm performance, and also as a "benchmark" comparison between two computers. However, you must keep in mind that with today's multi-tasking operating systems, measuring the execution time of any single process is difficult. The operating system splits CPU time between all of the programs that are running. No program has "exclusive" access to the CPU. Generally, CPU processing speed is fast enough so that you don't notice this because programs appear to be instantly responsive. Behind the scenes though, each program is getting a slice of CPU time, then waiting for its next turn before it can run again.

So it is important to remember that, due to multi-tasking, the processing times you measure with the applet below will not represent the actual CPU time required to perform an addition or subtraction. In order for the applet to give you a best estimate, keep the number of open applications to a minimum, and make sure that any open applications are not performing tasks that require lots of CPU time (e.g., printing files or downloading content from the Internet).
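The applet itself is not reproduced here, but the measurement idea can be sketched in plain C using the standard clock() function (the loop count is an arbitrary choice); as explained above, the result includes loop overhead and multitasking noise, so treat it as an estimate:

#include <stdio.h>
#include <time.h>

#define N 100000000L /* arbitrary number of repetitions */

int main()
{
    volatile double x = 1.0; /* volatile keeps the loop from being optimized away */
    long i;
    clock_t start, end;

    start = clock();
    for (i = 0; i < N; i++)
        x = x + 1.5; /* the operation being timed */
    end = clock();

    printf("%ld additions took %.2f seconds of CPU time\n",
           N, (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}

Changing the operation (to *, /, or integer arithmetic) and the type of x lets you compare the same cases the applet measures.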

computer shortcuts

Accessibility Keyboard Shortcuts

1) Right SHIFT for eight seconds (Switch FilterKeys either on or off)

2) Left ALT+left SHIFT+PRINT SCREEN (Switch High Contrast either on or off)

3) Left ALT+left SHIFT+NUM LOCK (Switch the MouseKeys either on or off)

4) SHIFT five times (Switch the StickyKeys either on or off)

5) NUM LOCK for five seconds (Switch the ToggleKeys either on or off)
6) Windows Logo +U (Open Utility Manager)


Windows Explorer Keyboard Shortcuts

1) END (Display the bottom of the active window)
2) HOME (Display the top of the active window)
3) NUM LOCK+Asterisk sign (*) (Display all of the subfolders that are under the selected folder)
4) NUM LOCK+Plus sign (+) (Display the contents of the selected folder)
5) NUM LOCK+Minus sign (-) (Collapse the selected folder)
6) LEFT ARROW (Collapse the current selection if it is expanded, or select the parent folder)
7) RIGHT ARROW (Display the current selection if it is collapsed, or select the first subfolder)


Shortcut Keys for Character Map

After you double-click a character on the grid of characters, you can move through the grid by using the keyboard shortcuts:
1) RIGHT ARROW (Move to the right or to the beginning of the next line)
2) LEFT ARROW (Move to the left or to the end of the previous line)
3) UP ARROW (Move up one row)
4) DOWN ARROW (Move down one row)
5) PAGE UP (Move up one screen at a time)
6) PAGE DOWN (Move down one screen at a time)
7) HOME (Move to the beginning of the line)
8) END (Move to the end of the line)
9) CTRL+HOME (Move to the first character)
10) CTRL+END (Move to the last character)
11) SPACEBAR (Switch between Enlarged and Normal mode when a character is selected)


Microsoft Management Console (MMC) Main Window Keyboard Shortcuts

1) CTRL+O (Open a saved console)
2) CTRL+N (Open a new console)
3) CTRL+S (Save the open console)
4) CTRL+M (Add or remove a console item)
5) CTRL+W (Open a new window)
6) F5 key (Update the content of all console windows)
7) ALT+SPACEBAR (Display the MMC window menu)
8) ALT+F4 (Close the console)
9) ALT+A (Display the Action menu)
10) ALT+V (Display the View menu)
11) ALT+F (Display the File menu)
12) ALT+O (Display the Favorites menu)


MMC Console Window Keyboard Shortcuts

1) CTRL+P (Print the current page or active pane)
2) ALT+Minus sign (-) (Display the window menu for the active console window)
3) SHIFT+F10 (Display the Action shortcut menu for the selected item)
4) F1 key (Open the Help topic, if any, for the selected item)
5) F5 key (Update the content of all console windows)
6) CTRL+F10 (Maximize the active console window)
7) CTRL+F5 (Restore the active console window)
8) ALT+ENTER (Display the Properties dialog box, if any, for the selected item)
9) F2 key (Rename the selected item)
10) CTRL+F4 (Close the active console window. When a console has only one console window, this shortcut closes the console)


Remote Desktop Connection Navigation

1) CTRL+ALT+END (Open the Microsoft Windows NT Security dialog box)
2) ALT+PAGE UP (Switch between programs from left to right)
3) ALT+PAGE DOWN (Switch between programs from right to left)
4) ALT+INSERT (Cycle through the programs in most recently used order)
5) ALT+HOME (Display the Start menu)
6) CTRL+ALT+BREAK (Switch the client computer between a window and a full screen)
7) ALT+DELETE (Display the Windows menu)
8) CTRL+ALT+Minus sign (-) (Place a snapshot of the active window in the client on the Terminal server clipboard and provide the same functionality as pressing PRINT SCREEN on a local computer.)
9) CTRL+ALT+Plus sign (+) (Place a snapshot of the entire client window area on the Terminal server clipboard and provide the same functionality as pressing ALT+PRINT SCREEN on a local computer.)


Microsoft Internet Explorer Navigation

1) CTRL+B (Open the Organize Favorites dialog box)
2) CTRL+E (Open the Search bar)
3) CTRL+F (Start the Find utility)
4) CTRL+H (Open the History bar)
5) CTRL+I (Open the Favorites bar)
6) CTRL+L (Open the Open dialog box)
7) CTRL+N (Start another instance of the browser with the same Web address)
8) CTRL+O (Open the Open dialog box, the same as CTRL+L)
9) CTRL+P (Open the Print dialog box)
10) CTRL+R (Update the current Web page)
11) CTRL+W (Close the current window)

computer hardware

What is Computer Hardware?

Computer hardware consists of the components and equipment involved in the functioning of a computer, the parts that can be physically handled. The function of these components is typically divided into three main categories: input, output, and storage.

Components in these categories connect to microprocessors, specifically, the computer's central processing unit (CPU), the electronic circuitry that provides the computational ability and control of the computer, via wires or circuitry called a bus.

Software, on the other hand, is the set of instructions a computer uses to manipulate data, such as a word-processing program or a video game. These programs are usually stored and transferred via the computer's hardware to and from the CPU.

Software also governs how the hardware is utilized; for example, how information is retrieved from a storage device. The interaction between the input and output hardware is controlled by software called the Basic Input Output System software (BIOS).

Although microprocessors are still technically considered to be hardware, portions of their function are also associated with computer software. The low-level instructions stored permanently inside such hardware are often referred to as firmware.

There!

That's it. Those are the basic components of computer hardware. You can review deeper breakdowns of each category by clicking on the categories and subcategories on your right-hand side. We have subdivided the categories as follows.

Input Devices

Output Devices

Storage Devices

Hardware Connections

Data storage device

Storage Devices
Storage hardware provides permanent storage of information and programs for retrieval by the computer. The two main types of storage devices are disk drives and memory.

There are several types of disk drives: hard, floppy, magneto-optical, and compact.

Hard disk drive
Hard disk drives store information in magnetic particles embedded in a disk. Usually a permanent part of the computer, hard disk drives can store large amounts of information and retrieve that information very quickly.

Floppy disk drive
Floppy disk drives also store information in magnetic particles embedded in removable disks that may be floppy or rigid. Floppy disks store less information than a hard disk drive and retrieve the information at a much slower rate.


Magneto-optical disc drive
Magneto-optical disc drives store information on removable discs that are sensitive to both laser light and magnetic fields. They can typically store as much information as hard disks, but they have slightly slower retrieval speeds.

Compact disc drive
Compact disc drives store information on pits burned into the surface of a disc of reflective material. CD-ROMs can store about as much information as a hard drive but have a slower rate of information retrieval. A digital video disc (DVD) looks and works like a CD-ROM but can store more than 7 times as much information.

Memory
Memory refers to the computer chips that store information for quick retrieval by the CPU. Random access memory (RAM) is used to store the information and instructions that operate the computer's programs.

Typically, programs are transferred from storage on a disk drive to RAM. RAM is also known as volatile memory because the information within the computer chips is lost when power to the computer is turned off.

Read-only memory (ROM) contains critical information and software that must be permanently available for computer operation, such as the operating system that directs the computer's actions from start up to shut down. ROM is called nonvolatile memory because the memory chips do not lose their information when power to the computer is turned off.

Some devices serve more than one purpose. For example, floppy disks may also be used as input devices if they contain information to be used and processed by the computer user. In addition, they can be used as output devices if the user wants to store the results of computations on them.

computer input device

Computer Input Devices
Input devices consist of external devices - that is, devices outside of the computer's CPU - that provide information and instructions to the computer.

Light pen
A light pen is a stylus with a light sensitive tip that is used to draw directly on a computer's video screen or to select information on the screen by pressing a clip in the light pen or by pressing the light pen against the surface of the screen. The pen contains light sensors that identify which portion of the screen it is passed over.

Mouse
A mouse is a pointing device designed to be gripped by one hand. It has a detection device (usually a ball) on the bottom that enables the user to control the motion of an on-screen pointer, or cursor, by moving the mouse on a flat surface. As the device moves across the surface, the cursor moves across the screen. To select items or choose commands on the screen, the user presses a button on the mouse.

Joystick
A joystick is a pointing device composed of a lever that moves in multiple directions to navigate a cursor or other graphical object on a computer screen.

Keyboard
A keyboard is a typewriter-like device that allows the user to type in text and commands to the computer. Some keyboards have special function keys or integrated pointing devices, such as a trackball or touch-sensitive regions that let the user's finger motions move an on-screen cursor.


Optical Scanner
An optical scanner uses light-sensing equipment to convert images such as a picture or text into electronic signals that can be manipulated by a computer. For example, a photograph can be scanned into a computer and then included in a text document created on that computer.

The two most common scanner types are the flatbed scanner, which is similar to an office photocopier, and the handheld scanner, which is passed manually across the image to be processed.

Microphone
A microphone is a device for converting sound into signals that can then be stored, manipulated, and played back by the computer.

Voice recognition
A voice recognition module is a device that converts spoken words into information that the computer can recognize and process.

Modem
A modem, which stands for modulator-demodulator, is a device that connects a computer to a telephone line or cable television network and allows information to be transmitted to or received from another computer. Each computer that sends or receives information must be connected to a modem. The digital signal sent from one computer is converted by the modem into an analog signal, which is then transmitted by telephone lines or television cables to the receiving modem, which converts the signal back into a digital signal that the receiving computer can understand.

There are a few different types of modems, including...
Analog: 56K (52 Kbps top speed) over a telephone line.
DSL/ADSL: Top speed can reach as high as 5 Mbps for residential service, over a telephone line.
Cable: Top speed about 8 Mbps, over a cable TV line.
Fiber optic: Can reach 30 Mbps or more, depending on your available provider.

A modem is an input device as well as an output device, so we will place detailed articles about modems under the "Networking" category.

history of computer

Part I

Webster's Dictionary defines "computer" as any programmable electronic device that can store, retrieve, and process data. The basic idea of computing develops in the 1200's when a Muslim cleric proposes solving problems with a series of written procedures.

As early as the 1640's mechanical calculators are manufactured for sale. Records exist of earlier machines, but Blaise Pascal invents the first commercial calculator, a hand powered adding machine. Although attempts to multiply mechanically were made by Gottfried Leibniz in the 1670s, the first true multiplying calculator appears in Germany shortly before the American Revolution.

In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by reading punched holes stored on small sheets of hardwood. These plates are then inserted into the loom, which reads (retrieves) the pattern and creates (processes) the weave. Powered by water, this "machine" came 140 years before the development of the modern computer.

[Image: Ada, Countess of Lovelace]

Shortly after the first mass-produced calculator (1820), Charles Babbage begins his lifelong quest for a programmable machine. Although Babbage was a poor communicator and record-keeper, his analytical engine is sufficiently developed by 1842 that Ada Lovelace uses it to mechanically translate a short written work. She is generally regarded as the first programmer. Twelve years later George Boole, while professor of Mathematics at Cork University, writes An Investigation of the Laws of Thought (1854), and is generally recognized as the father of computer science.

The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to create weaves. Developed by Herman Hollerith of MIT, the system uses electric power (non-mechanical). The Hollerith Tabulating Company is a forerunner of today's IBM.

Just prior to the introduction of Hollerith's machine the first printing calculator is introduced. In 1892 William Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although hand-powered, Burroughs quickly introduces an electric model.

In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a machine he calls the differential analyzer. Using a set of gears and shafts, much like Babbage, the machine can handle simple calculus problems, but accuracy is a problem.

The period from 1935 through 1952 gets murky with claims and counterclaims of who invents what and when. Part of the problem lies in the international situation that makes much of the research secret. Other problems include poor record-keeping, deception and lack of definition.

In 1935, Konrad Zuse, a German construction engineer, builds a mechanical calculator to handle the math involved in his profession. Shortly after completion, Zuse starts on a programmable electronic device which he completes in 1938.


[Image: John Vincent Atanasoff, courtesy Jo Campbell, The Shore Journal]

John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the Physics building on the campus of Iowa State. A graduate student, Clifford (John) Berry, assists. The "ABC" is designed to solve linear equations common in physics. It displays some early features of later computers, including electronic calculations. He shows it to others in 1939 and leaves the patent application with attorneys for the school when he leaves for a job in Washington during World War II. Unimpressed, the school never files the application, and the ABC is cannibalized by students.

[Image: the Enigma machine, courtesy U.S. Army]

The Enigma, a complex mechanical encoder, is used by the Germans, and they believe it to be unbreakable. Several people involved, most notably Alan Turing, conceive machines to handle the problem, but none are technically feasible. Turing proposes a "Universal Machine" capable of "computing" any algorithm in 1937. That same year George Stibitz creates his Model K(itchen), a conglomeration of otherwise useless and leftover material, to solve complex calculations. He improves the design while working at Bell Labs and on September 11, 1940, Stibitz uses a teletype machine at Dartmouth College in New Hampshire to transmit a problem to his Complex Number Calculator in New York and receives the results. It is the first example of a network.

First in Poland, and later in Great Britain and the United States, the Enigma code is broken. Information gained by this shortens the war. To break the code, the British, led by Turing, build the Colossus Mark I. The existence of this machine is a closely guarded secret of the British Government until 1970. The United States Navy, aided to some extent by the British, builds a machine capable of breaking not only the German code but the Japanese code as well.

output device of computer

Computer Output Devices

Output Device
Output hardware consists of external devices that transfer information from the computer's CPU to the computer user. A video display, or screen, converts information generated by the computer into visual information.

Display
Displays commonly take one of two forms: a video screen with a cathode ray tube (CRT) or a video screen with a liquid crystal display (LCD).

CRT
A CRT-based screen, or monitor, looks similar to a television set. Information from the CPU is displayed using a beam of electrons that scans a phosphorescent surface that emits light and creates images.

Flat Panel or LCD
An LCD-based screen displays visual information on a flatter and smaller screen than a CRT-based video monitor. LCDs are frequently used in laptop computers.

Printers
Printers take text and images from a computer and print them on paper. Dot-matrix printers use tiny wires to impact upon an inked ribbon to form characters. Laser printers employ beams of light to draw images on a drum that then picks up fine black particles called toner. The toner is fused to a page to produce an image. Inkjet printers fire droplets of ink onto a page to form characters and pictures.

Speakers
Ahh... gamers cannot ignore this category. However, we'll talk about speakers another time.





computer science tutorial
seminar topic
newidea

computer generation

In the beginning ...
A generation refers to a stage of improvement in the development of a product, and the term is also applied to the successive advances in computer technology. With each new generation, the circuitry has gotten smaller and more advanced than in the generation before it. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play.
computer science tutorial
seminar topic
newidea
The First Generation: 1946-1958 (The Vacuum Tube Years)
The first generation computers were huge, slow, expensive, and often undependable. In 1946, two Americans, J. Presper Eckert and John Mauchly, built the ENIAC, an electronic computer that used vacuum tubes instead of the mechanical switches of the Mark I. The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off a great deal of heat, just as light bulbs do. The ENIAC led to other vacuum-tube computers such as the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC I (UNIVersal Automatic Computer).

The vacuum tube was an extremely important step in the advancement of computers. Developed in the early 1900s out of Thomas Edison's light bulb work, the vacuum tube looks and behaves much like a light bulb. Its purpose was to act as an amplifier and a switch. Without any moving parts, vacuum tubes could take very weak signals and make them stronger (amplify them), and they could stop and start the flow of electricity almost instantly (switch). These two properties made the ENIAC possible.

The ENIAC gave off so much heat that it had to be cooled by gigantic air conditioners. Even with these huge coolers, however, the vacuum tubes still overheated regularly. It was time for something new.
The Second Generation: 1959-1964 (The Era of the Transistor)
The era of the transistor computer did not last as long as that of the vacuum tube computer, but it was no less important in the advancement of computer technology. In 1947, three scientists working at AT&T's Bell Labs, John Bardeen, Walter Brattain, and William Shockley, invented what would replace the vacuum tube forever: the transistor, which functions like a vacuum tube in that it can be used to relay and switch electronic signals.

There were obvious differences between the transistor and the vacuum tube. The transistor was faster, more reliable, smaller, and much cheaper to build; one transistor replaced the equivalent of about 40 vacuum tubes. Transistors were made of solid material, chiefly silicon, an abundant element (second only to oxygen in the Earth's crust) found in beach sand and glass, so they were very cheap to produce. Transistors conducted electricity faster and better than vacuum tubes, were much smaller, and gave off virtually no heat in comparison. Their use marked a new beginning for the computer; without this invention, space travel in the 1960s would not have been possible. However, a new invention would advance our ability to use computers even further.
The Third Generation: 1965-1970 (Integrated Circuits - Miniaturizing the Computer)
Transistors were a tremendous breakthrough in advancing the computer, yet no one could have predicted that thousands, and now hundreds of millions, of transistors (circuits) could be compacted into such a small space. The integrated circuit, sometimes called the semiconductor chip, packs a huge number of transistors onto a single wafer of silicon. Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments independently discovered the amazing attributes of integrated circuits. Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably.

Since the invention of integrated circuits, the number of transistors that can be placed on a single chip has doubled roughly every two years, an observation known as Moore's law, shrinking both the size and cost of computers while further increasing their power. Most electronic devices today use some form of integrated circuits placed on printed circuit boards, thin pieces of bakelite or fiberglass that have electrical connections etched onto them; the main board in a computer is called the motherboard.
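The growth that doubling implies is easy to tabulate. Here is a quick C sketch; the starting point of roughly 2,300 transistors is the Intel 4004 of 1971, used only as a convenient reference:

    #include <stdio.h>

    int main(void)
    {
        /* transistors per chip, doubling every two years from the
           Intel 4004's roughly 2,300 in 1971 */
        double transistors = 2300.0;
        for (int year = 1971; year <= 2001; year += 2) {
            printf("%d: about %.0f transistors\n", year, transistors);
            transistors *= 2.0;
        }
        return 0;
    }

Fifteen doublings carry the count from a few thousand into the tens of millions, which is roughly where commercial processors actually stood by the turn of the century.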

These third generation computers could carry out instructions in billionths of a second, and the size of the machines dropped to that of small file cabinets. Yet the single biggest advancement of the computer era was still to come.
The Fourth Generation: 1971-Today (The Microprocessor)
This generation is characterized both by the jump to monolithic integrated circuits (millions of transistors put onto one chip) and by the invention of the microprocessor, a single chip that can do all the processing of a full-scale computer. Putting millions of transistors onto a single chip allowed computers to do more calculation at greater speed, and because electricity travels about a foot in a billionth of a second, the smaller the distances inside the machine, the faster it can run.
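That foot-per-nanosecond figure puts a hard ceiling on how far a signal can travel within one clock cycle, which is why shrinking the circuitry raises the speed limit. A back-of-the-envelope C calculation; the sample clock rates are arbitrary:

    #include <stdio.h>

    int main(void)
    {
        /* electrical signals cover, at best, just under a foot per nanosecond */
        const double feet_per_ns = 0.98;
        const double clocks_mhz[] = { 1.0, 100.0, 3000.0 };  /* arbitrary examples */

        for (int i = 0; i < 3; i++) {
            double ns_per_cycle = 1000.0 / clocks_mhz[i];    /* MHz to ns per cycle */
            printf("%6.0f MHz: %8.3f ns per cycle, at most %7.2f ft of travel\n",
                   clocks_mhz[i], ns_per_cycle, ns_per_cycle * feet_per_ns);
        }
        return 0;
    }

At 3 GHz a signal can cover only about four inches per cycle, so a physically large processor simply could not keep up with its own clock.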

However, what really triggered the tremendous growth of computers and their significant impact on our lives was the invention of the microprocessor. Ted Hoff, employed by Intel (Robert Noyce's new company), invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer. The microprocessor was made to be used in calculators, not computers. It led, however, to the invention of personal computers, or microcomputers.

It wasn't until the 1970s that people began buying computers for personal use. One of the earliest personal computers was the Altair 8800 computer kit: in 1975 you could purchase the kit and put it together to make your own personal computer. In 1977 the Apple II went on sale to the public, and in 1981 IBM entered the PC (personal computer) market.

Today we have all heard of Intel and its Pentium® processors, and now we know how it all got started. The computers of the next generation will have millions upon millions of transistors on one chip and will perform over a billion calculations in a single second. There is no end in sight for the computer movement.

linux commands

An alphabetical index of common Linux commands follows. (A short C sketch after the list shows how a program can launch any of these commands.)
accept
access
aclocal
aconnect
acpi
acpi_available
acpid
addr2line
addresses
agetty
alsactl
amidi
amixer
anacron
aplay
aplaymidi
apm
apmd
apropos
apt
ar
arch
arecord
arecordmidi
arp
as
at
atd
atq
atrm
audiosend
aumix
autoconf
autoheader
automake
autoreconf
autoscan
autoupdate
b

badblocks
banner
basename
bash
batch
bc
biff
bison
bzcmp
bzdiff
bzgrep
bzip2
bzless
bzmore
c

c++
c++filt
cal
cardctl
cardmgr
cat
cc
cdda2wav
cdparanoia
cdrdao
cdrecord
cfdisk
chage
chattr
chfn
chgrp
chkconfig
chmod
chown
chpasswd
chroot
chrt
chsh
chvt
cksum
clear
cmp
col
colcrt
colrm
column
comm
compress
cp
cpio
cpp
crond
crontab
csplit
ctags
cupsd
cut
cvs
d

date
dd
deallocvt
debugfs
depmod
devdump
df
diff
diff3
dig
dir
dircolors
dirname
disable
dlpsh
dmesg
dnsdomainname
dnssec-keygen
dnssec-makekeyset
dnssec-signkey
dnssec-signzone
doexec
domainname
dosfsck
du
dump
dumpe2fs
dumpkeys
e

e2fsck
e2image
e2label
echo
edquota
egrep
eject
elvtune
emacs
enable
env
envsubst
esd
esd-config
esdcat
esdctl
esddsp
esdmon
esdplay
esdrec
esdsample
etags
ex
expand
expr
f

factor
false
fc-cache
fc-list
fdformat
fdisk
fetchmail
fgconsole
fgrep
file
find
finger
fingerd
flex
fmt
fold
formail
free
fsck
ftp
ftpd
fuser
g

g++
gawk
gcc
gdb
getent
getkeycodes
gpasswd
gpg
gpgsplit
gpgv
gpm
gprof
grep
groff
groffer
groupadd
groupdel
groupmod
groups
grpck
grpconv
gs
gunzip
gzexe
gzip
h

halt
hdparm
head
hexdump
host
hostid
hostname
htdigest
hwclock
i

iconv
id
ifconfig
imapd
inetd
info
init
insmod
install
ipcrm
ipcs
iptables
iptables-restore
iptables-save
isodump
isoinfo
isosize
isovfy
ispell
j

join
k

kbd_mode
kbdrate
kernelversion
kill
killall
killall5
klogd
kudzu
l

last
lastb
lastlog
ld
ldconfig
ldd
less
lesskey
lftp
lftpget
link
ln
loadkeys
locale
locate
lockfile
logger
login
logname
logrotate
look
losetup
lpadmin
lpinfo
lpmove
lpq
lpr
lprm
lpstat
ls
lsattr
lspci
lsmod
lsusb
m

m4
mail
mailq
mailstats
mailto
make
makedbm
makemap
man
manpath
mattrib
mbadblocks
mcat
mcd
mcopy
md5sum
mdel, mdeltree
mdir
mdu
merge
mesg
metamail
metasend
mformat
mimencode
minfo
mkdir
mkdosfs
mke2fs
mkfifo
mkfs
mkfs.ext3
mkisofs
mklost+found
mkmanifest
mknod
mkraid
mkswap
mktemp
mlabel
mmd
mmount
mmove
modinfo
modprobe
more
mount
mountd
mpartition
mpg123
mpg321
mrd
mren
mshowfat
mt
mtools
mtoolstest
mtype
mv
mzip
n

named
namei
nameif
netstat
newaliases
newgrp
newusers
nfsd
nfsstat
nice
nm
nohup
nslookup
nsupdate
o

objcopy
objdump
od
openvt
p

passwd
paste
patch
pathchk
perl
pidof
ping
pinky
pmap
portmap
poweroff
pppd
pr
praliases
printenv
printf
ps
ptx
pwck
pwconv
pwd
python
q

quota
quotacheck
quotaon
quotaoff
quotastats
r

raidstart
ramsize
ranlib
rarpd
rcp
rdate
rdev
rdist
rdistd
readcd
readelf
readlink
reboot
reject
rename
renice
repquota
reset
resize2fs
restore
rev
rexec
rexecd
richtext
rlogin
rlogind
rm
rmail
rmdir
rmmod
rndc
rootflags
route
routed
rpcgen
rpcinfo
rpm
rsh
rshd
rsync
runlevel
rup
ruptime
rusers
rusersd
rwall
rwho
rwhod
s

sane-find-scanner
scanadf
scanimage
scp
screen
script
sdiff
sed
sendmail
sensors
seq
setfdprm
setkeycodes
setleds
setmetamode
setquota
setsid
setterm
sftp
sh
sha1sum
showkey
showmount
shred
shutdown
size
skill
slabtop
slattach
sleep
slocate
snice
sort
split
ssh
ssh-add
ssh-agent
ssh-keygen
ssh-keyscan
sshd
stat
statd
strace
strfile
strings
strip
stty
su
sudo
sum
swapoff
swapon
sync
sysctl
sysklogd
syslogd
t

tac
tail
tailf
talk
talkd
tar
taskset
tcpd
tcpdump
tcpslice
tee
telinit
telnet
telnetd
test
tftp
tftpd
time
tload
tmpwatch
top
touch
tr
tracepath
traceroute
troff
true
tset
tsort
tty
tune2fs
tunelp
u

ul
umount
uname
uncompress
unexpand
unicode_start
unicode_stop
uniq
uptime
useradd
userdel
usermod
users
usleep
uudecode
uuencode
uuidgen
v

vdir
vi
vidmode
vim
vmstat
volname
w

w
wall
warnquota
watch
wc
wget
whatis
whereis
which
who
whoami
whois
write
x

xargs
xinetd
y

yacc
yes
ypbind
ypcat
ypinit
ypmatch
yppasswd
yppasswdd
yppoll
yppush
ypserv
ypset
yptest
ypwhich
ypxfr
z

zcat
zcmp
zdiff
zdump
zforce
zgrep
zic
zless
zmore
znew
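These commands are normally typed at a shell prompt, but any program can launch them the same way the shell does. Here is a minimal C sketch, for POSIX systems, that uses fork and execvp to run ls -l; the command and its arguments are just examples.

    #include <stdio.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        pid_t pid = fork();                 /* create a child process */
        if (pid < 0) {
            perror("fork");
            return 1;
        }
        if (pid == 0) {
            /* child: replace this process image with the ls command */
            char *argv[] = { "ls", "-l", NULL };
            execvp("ls", argv);
            perror("execvp");               /* reached only if exec fails */
            _exit(1);
        }
        int status;
        waitpid(pid, &status, 0);           /* parent: wait for ls to finish */
        return WIFEXITED(status) ? WEXITSTATUS(status) : 1;
    }

This fork-then-exec pattern is exactly what a shell does for every command you type, which is why each entry in the list above is an ordinary program rather than part of the shell itself.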


computer science tutorial
seminar topic
newidea