
Saturday 30 May 2015

Share Wi-Fi


You can turn your Windows Phone into a mobile hotspot by sharing the mobile network's data connection over Wi-Fi or Bluetooth. Other devices with Wi-Fi or Bluetooth can then use your shared data connection to reach the Internet.

To share your mobile network's data connection over Wi-Fi:

In the application list, tap Settings > Internet Sharing.

Under Share over, tap Wi-Fi.

Switch Sharing to On.

(Optional) To change the name or password of the broadcast network, tap Edit, make either of the following changes on the Internet Sharing screen, and then tap Done:

In the Broadcast name box, enter a network name. This is the Wi-Fi network name that others will see and use to connect to your shared connection from another device.

In the Password box, enter a password of at least 8 characters. This is the password required to connect to your shared connection.

LED Display



This article covers electronic displays based on LED technology; for screens that merely use LEDs as a backlight, see TV or computer monitor.

An LED display is an output device that presents data or information to the user and is characterized by being composed of light-emitting diodes (LEDs, from Light Emitting Diode).

This type of display should not be confused with LCDs that use an LED backlight, currently very common in laptops, monitors, and TVs. For business reasons (to make the technology appear more modern for commercial purposes), LED backlighting is marketed as a new technology and "LED" appears in the designation of these devices, without this actually involving a genuinely important technological change.

The display consists of panels or modules of LEDs (light-emitting diodes), whether monochromatic or polychromatic; the latter are in turn built from RGB LEDs (the primary colors of light) or from other configurations tailored to the application. Together these modules form pixels, which makes it possible to display characters, text, images, and even video, depending on the complexity of the display and its control device.

The most common uses for LED screens are signs, information boards, advertising, and high-resolution full-color video (concerts, public events, and so on). This is due to their high resistance outdoors, their ease of manufacture and maintenance, and their low power consumption.

One problem with LED screens is resolution: while a computer monitor today achieves resolutions from 1024x768 up to 4096x2160 (4K), a 4x3 meter LED screen reaches only 192x144 physical pixels. To address this, a technique known as virtual pixel technology was developed, which offers greater apparent image resolution on the same physical configuration by exploiting some basic geometric concepts. There are two virtual pixel techniques: geometric/square and interpolated.[1]


Thus, the display provides pixels and subpixels formed by green, red, and blue LEDs which, through the mixture or combination of these light elements, can produce more than 16 million colors.
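To make the numbers above concrete, here is a small Python sketch of the arithmetic (the screen dimensions are the ones quoted above; the script itself is purely illustrative):

# Arithmetic behind the resolution and color figures quoted above.
width_m, height_m = 4.0, 3.0            # physical screen size in meters
pixels_w, pixels_h = 192, 144           # physical LED pixels at that size

pitch_mm = width_m / pixels_w * 1000.0  # distance between pixel centers
print(f"pixel pitch: {pitch_mm:.1f} mm")   # about 20.8 mm per pixel

# Each pixel mixes red, green and blue subpixels; with 256 intensity
# levels per color the palette is:
print(f"colors: {256 ** 3:,}")             # 16,777,216, i.e. more than 16 million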

Wednesday 27 May 2015

Digital Audio



Digital audio is the digital encoding of an electrical signal that represents a sound wave. It consists of a sequence of integer values obtained through two processes: sampling and quantization of the electrical signal.


[Figure: sampling of a digital audio signal.]

Sampling consists of measuring the amplitude of the electrical signal at regular time intervals (the sampling rate). To cover the audible spectrum (20 to 20,000 Hz), sampling rates above 40,000 Hz are usually sufficient (the CD-Audio standard uses a rate about 10% higher in order to accommodate non-ideal filters); 32,000 samples per second gives a bandwidth similar to that of FM radio or a cassette tape, i.e. it can register components up to approximately 15 kHz.

To reproduce a given frequency range, a sampling rate of slightly more than double that range is needed (the Nyquist-Shannon sampling theorem). For example, CDs, which reproduce frequencies up to 20 kHz, use a sampling rate of 44.1 kHz (a Nyquist frequency of 22.05 kHz).

Quantization converts the level of each sample fixed in the sampling process, typically a voltage level, into a finite integer value within a predetermined range. For example, linear quantization with 8-bit coding discriminates between 256 equidistant signal levels (2^8). Non-linear quantization is also possible, for instance logarithmic quantizers such as mu-law or A-law which, still using 8 bits, perform perceptually like roughly 10 linear bits for low-amplitude signals such as the human voice.
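As a minimal sketch of both processes, the following Python snippet (using NumPy; the tone frequency and duration are arbitrary choices) samples a sine wave at the CD rate and quantizes it linearly to 8 bits:

import numpy as np

fs = 44100                                # sampling rate in Hz (CD-Audio)
tone = 440.0                              # arbitrary test tone in Hz
t = np.arange(0, 0.01, 1.0 / fs)          # sampling instants, 10 ms worth
signal = np.sin(2 * np.pi * tone * t)     # "analog" signal in [-1, 1]

bits = 8
levels = 2 ** bits                        # 256 equidistant levels
codes = np.round((signal + 1) / 2 * (levels - 1)).astype(int)

print(codes.min(), codes.max())           # integer codes in [0, 255]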

The most widely used form of linear PCM digital audio is the audio CD: a 44.1 kHz sampling rate and 16-bit linear quantization (measuring 65,536 different signal levels). In practice, it can record analog signals with components up to 20 kHz and a signal-to-noise ratio of over 90 dB.

Friday 22 May 2015

Dual, Quad, Octa Core...



The processing core is the heart of a processor. A dual core, for example, is a CPU composed of two "physical" processor cores mounted on the same package (that is, in the same container or housing into which the electronic circuits are inserted). But how did we get to this point? Because parallelism, compared with raising the clock frequency, allows performance to increase greatly: although 4 GHz is easily reachable from a technical point of view, the thermal disadvantages, the high cooling requirements, and the huge power consumption are factors that must be taken seriously.

But doesn't a dual core consume more? Although it requires more energy when both cores are under load, a dual core processor finishes its work sooner than a single core processor, which can result in overall consumption even lower than that of a single core CPU. Also, to save even more, there are techniques that "put to sleep" one or more cores when they are not in use, or that lower the working frequency in order to reduce energy demands.
This does not mean that a dual core processor at 2.2 GHz runs at 4.4 GHz, but simply that each core works at a frequency of 2.2 GHz.
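As a rough illustration of the point, the sketch below (the task size is arbitrary and the timings illustrative) runs the same CPU-bound work first on one core and then spread across all available cores with Python's multiprocessing module; on a multi-core machine the second run finishes sooner even though no single core runs any faster:

import multiprocessing as mp
import time

def busy_work(n):
    # CPU-bound task: sum of squares up to n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 4                  # four identical tasks

    start = time.perf_counter()
    [busy_work(n) for n in jobs]            # one core, one task at a time
    print(f"sequential: {time.perf_counter() - start:.2f} s")

    start = time.perf_counter()
    with mp.Pool(mp.cpu_count()) as pool:   # tasks spread across the cores
        pool.map(busy_work, jobs)
    print(f"parallel:   {time.perf_counter() - start:.2f} s")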

So which is better: an x64 dual core at 3.2 GHz or an x64 quad core at 2 GHz? The answer is complicated; let's say it depends. A given dual core can outperform a given quad core because it works at a higher frequency, though it may also consume more. And it is not a foregone conclusion, because performance depends on the frequency but also on how the cores are actually exploited (even if, in principle, processors with ever higher frequencies are preferred) and on the use one intends to make of them: to open a Word document or read an e-mail message, there is no need to buy a quad core for 300 € (if not more). Conversely, if one wants a gaming computer that can run a video game with every detail at maximum and no compromises, then a quad core processor with the latest technology and at least 3 GHz is needed (in addition to a good video card).


But then why are there all these dual cores, quad cores, octa cores...? Because it is also a question of marketing: having four cores instead of two can be a simple source of pride that leads some people to buy a product based solely on this factor (whether it is a processor for computers or one for smartphones and tablets).

Charles Babbage



Charles Babbage (London, 26 December 1791 [1] - London, 18 October 1871 [1]) was a British mathematician, philosopher, and proto-computer scientist who first had the idea of a programmable computer. In the world of computing he is known for his machines: the first, the Difference Engine, was an imperfect prototype, while the second, the Analytical Engine, was only ever designed.

Parts of Babbage's incomplete mechanisms are on display at the Science Museum in London. In 1991, working from his original plans, a fully functional Difference Engine was completed, assembled to the standards available in the nineteenth century, which showed that Babbage's machine would have worked.


Faced with the high number of errors in the calculation of mathematical tables, Babbage decided to find a method by which they could be calculated by a machine, one not subject to the errors, fatigue, and boredom that beset human calculators. This idea came to him as early as 1812. It seems that Babbage was influenced by three factors: an aversion to disorder, his familiarity with logarithmic tables, and the work on calculating machines carried out by Blaise Pascal and Gottfried Leibniz.

Wednesday 20 May 2015

How To Back Up And Restore Your Drivers In Windows



Drivers are internal programs that tell the system how to handle the different devices that can be connected, from simple things like the microphone, keyboard, and mouse to video cards, sound systems, network adapters, printers, etc. Each function has a driver that must be recognized by the system in order to work properly.

When we install Windows from scratch, the system can detect many of the drivers automatically and install the appropriate ones. But there are cases, such as video cards or wireless network adapters, where the system does not include a compatible driver and we have to use the manufacturer's CD or search the Internet.

Also, as many already know, if you ever reinstall Windows it is a good idea to keep your drivers so you do not have to scramble around the Internet for them, especially when the computer is more than three years old and they may no longer be findable.

The other option is to use tools that can detect all the drivers installed on Windows, make a backup of them, and then restore them on the new installation in a few clicks. Those are what we review today:


Double Driver
Double Driver is a free application. It is an extremely simple tool that scans the system and detects all the installed drivers. It then shows a complete list and automatically marks the most important drivers, the ones we have installed and should back up.

We can select all of them, or only those we deem necessary, and make the backup. The driver backup can be: a .zip file, a folder with a structured subfolder for each driver, or an executable that installs itself when you need to restore.

Then simply copy the backup to a USB flash drive, from which we will restore the drivers once Windows has been reinstalled.

Double Driver is portable and requires no installation. It weighs 5 MB and is compatible with Windows XP, Vista, 7, 8, and 8.1, in 32- and 64-bit versions.

DriverMax
DriverMax is developed by Innovative Solutions, a Microsoft Certified Partner, which should give us some guarantee of quality. It is a free program, but updates are paid. DriverMax can update all drivers automatically, and it can also create a backup in a compressed format to restore them later.

DriverMax is compatible with Windows XP, Vista, 7, and 8, in 32- and 64-bit versions. For now it does not work with Windows 8.1.

Free Driver Backup
Free Driver Backup is a free Windows utility that backs up and restores drivers. In addition to detecting each driver, it offers detailed information about them, and it shows only the most important drivers that should be backed up, so we do not have to wade through the massive list of drivers without knowing what we are doing.

It also has features to back up registry keys, browser cookies, and Internet Explorer favorites.

Free Driver Backup weighs just 3 MB and is compatible with Windows XP, Vista, 7, 8, and 8.1.
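For the curious, the copy-out step these tools automate can be approximated by hand. The sketch below is a crude approximation, not any of the reviewed tools' method; it assumes a standard Windows layout and an elevated prompt, and simply archives the folder where Windows keeps its installed driver packages. Restoring would mean extracting the archive and pointing Device Manager's "Update driver" dialog at the extracted folder:

import shutil
from pathlib import Path

# Folder where Windows stores installed driver packages (standard layout).
driver_store = Path(r"C:\Windows\System32\DriverStore\FileRepository")

# Pack the whole store into drivers_backup.zip in the current directory.
archive = shutil.make_archive("drivers_backup", "zip", root_dir=driver_store)
print(f"driver store archived to {archive}")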

Business And Labor Perspective Of IT



In a business context, the Information Technology Association of America has defined IT as "the study, design, development, application, implementation, support and maintenance of computer-based information systems."

The responsibilities of work in this area include network administration, software development and installation, and the planning and management of the life cycle of an organization's technologies, whereby hardware and software are maintained, upgraded, and replaced.

The business value of information technology lies in the automation of business processes, providing information for decision-making, connecting businesses with customers and providing productivity tools to increase efficiency.

Tuesday 12 May 2015

Operating System



The operating system is the set of programs that manage the computer's resources and control its operation.

An OS performs five basic functions: providing the user interface, resource management, file management, task management, and support services.

Providing the user interface: allows the user to communicate with the computer through command-line interfaces, menu-based interfaces, and graphical user interfaces.
Resource management: manages hardware resources such as the CPU, memory, secondary storage devices, and input and output peripherals.

File management: controls the creation, deletion, copying, and access of data files and programs.
Task management: manages information about the programs and processes running on the computer; it can change the priority of processes, terminate them, and monitor their CPU usage (see the sketch after this list).

Support services: the support services depend on each operating system's implementation and may include new utilities, updated versions, security enhancements, new peripheral drivers, and software error corrections.
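As a small sketch of what task management looks like from user space (using the third-party psutil library, which must be installed separately with pip install psutil), the snippet below lists running processes with their CPU usage and reads the current process's priority:

import psutil  # third-party library: pip install psutil

# List running processes with the bookkeeping the OS maintains for each.
for proc in psutil.process_iter(["pid", "name", "cpu_percent"]):
    info = proc.info
    print(info["pid"], info["name"], info["cpu_percent"])

# Process priority (niceness) is also mediated by the OS.
print("current priority:", psutil.Process().nice())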

Decoder



A decoder is a combinational circuit whose function is the inverse of an encoder's: it converts a binary input (natural binary, BCD, etc.) on N input lines into M output lines (N can be any integer and M is an integer less than or equal to 2^N), such that each output line is activated for only one of the possible input combinations. These circuits are normally found as decoders/demultiplexers, because a demultiplexer can behave as a decoder.


If, for example, we have a decoder with 2 inputs and 2^2 = 4 outputs, its operation is as shown in the following truth table, in which an activated output carries a logic "1" (the inputs and outputs are numbered here for illustration):

A1 A0 | O3 O2 O1 O0
 0  0 |  0  0  0  1
 0  1 |  0  0  1  0
 1  0 |  0  1  0  0
 1  1 |  1  0  0  0
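The behavior is easy to model in software. This Python sketch (the names are illustrative) computes the four output lines from the two input bits and reproduces the truth table above:

def decode_2_to_4(a1, a0):
    # Interpret the two input bits as a binary number 0..3 ...
    index = (a1 << 1) | a0
    # ... and activate exactly that output line.
    return [1 if i == index else 0 for i in range(4)]  # listed as [O0, O1, O2, O3]

for a1 in (0, 1):
    for a0 in (0, 1):
        print(a1, a0, "->", decode_2_to_4(a1, a0))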

Friday 8 May 2015

Mobile Broadband In The USA



The project, according to Reuters, focuses on the development of super-high-speed fiber networks both in Kansas and in other locations in the US, and at its head is Craig Barratt, chief of Google's Access and Energy division and former head of the firm Atheros.

If Google concludes that a combination of fixed and mobile broadband at gigabit speeds is feasible and reliable, it will exert strong pressure on its competitors, according to several analyses. AT&T already has plans in place and has recently trialed gigabit broadband in Chicago and Atlanta, and Google's intentions will predictably encourage a movement in this direction.

The search giant began offering broadband services over cable in 2012 in Kansas City. Shortly afterwards, it decided to partner with Dish to build a complementary wireless network, and it is now considering extending the initiative to other cities.

The characteristics of millimeter-wave technology make it very suitable for short-range, high-performance communications, i.e. the so-called last mile that separates a wired network from the receiving antennas of our homes.

It is therefore very likely that Google is thinking of this technology not as a substitute for local access, but as a way to save some money by using dedicated wireless connections in the last segment of the network.

Officially, the company has confirmed neither its plans for the market nor whether these are mere field tests.

Remember that the company is already testing wireless Internet access in Africa through its Project Loon, based on a network of balloons operating as repeater antennas.

History of Microsoft



Microsoft is a multinational company dedicated to computer technology. The history of Microsoft begins on April 4, 1975, when it was founded by Bill Gates and Paul Allen in Albuquerque.[1] Its best-selling products are the Windows operating system and the Microsoft Office suite.

In the beginning, in 1980, Microsoft formed an important bond with IBM that allowed Microsoft's operating system to ship with IBM's computers, with Microsoft collecting royalties on each sale. In 1985, IBM asked Microsoft to make a new operating system for its computers, called OS/2. Microsoft made the operating system but continued to sell its own version in direct competition with OS/2, and Microsoft's version eclipsed OS/2 in sales. By the time Microsoft released its versions of Windows in the 90s, it had already captured 90% of the worldwide personal computer market share.

As of 2007, Microsoft had annual revenue of 51.12 billion dollars and at least 79,000 employees in 102 countries. It develops, manufactures, licenses, and supports a range of hardware and software products for computing devices.

Tuesday 5 May 2015

64 Bit



In computing, 64-bit is an adjective used to indicate that, in a given architecture, the standard format of a simple variable (integer, pointer, handle, etc.) is 64 bits long. This generally reflects the size of the internal registers of the CPUs used for that architecture.

The term "64-bit" may be used to describe the size of:

A unit of data
The internal registers of a CPU, or of the ALU that works on those registers
Memory addresses
The data transferred with each read or write to main memory
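A quick way to see the register and pointer width of the machine you are on is to ask the language runtime; this Python sketch prints the pointer size of the interpreter it runs under:

import struct
import sys

print(struct.calcsize("P") * 8, "bit pointers")  # 64 on a 64-bit build
print(sys.maxsize == 2 ** 63 - 1)                # True on 64-bit platforms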

32-bit versus 64-bit

The transition from a 32-bit architecture to a 64-bit one involves a profound change, since most operating systems must be heavily modified to take advantage of the new architecture. Other programs must first be "ported" to exploit the new capabilities; old programs are usually supported either by a hardware compatibility mode (in which the processor also supports the old 32-bit instruction set), by software emulation, or by implementing a 32-bit processor core inside the processor chip itself (as in Intel's Itanium processors, which include an x86 core).

A significant exception is the AS/400, whose software runs on a virtual ISA (Instruction Set Architecture) called TIMI (Technology Independent Machine Interface), which a layer of low-level software translates into native machine code before execution. This layer is all that needs to be rewritten to bring the entire operating system and all programs to a new platform, as when IBM migrated from the old 32/48-bit "IMPI" processor line to the 64-bit PowerPC (IMPI had nothing to do with the 32-bit PowerPC, so it was a more challenging transition than moving from a 32-bit instruction set to the 64-bit version of the same one). Another significant exception is IBM's z/Architecture, which smoothly runs applications with different types of addressing (24, 32, and 64 bit) simultaneously.

Although 64-bit architectures indisputably make it easier to work with massive amounts of data, as in digital video, scientific computing, and large databases, there have been several debates about whether they, or their 32-bit compatibility modes, are faster than 32-bit systems of similar price in other types of work.

Theoretically, some programs may be faster in 32-bit mode. On some architectures, 64-bit instructions take up more space than 32-bit ones, so certain 32-bit programs may fit into the CPU's fast cache memory where the 64-bit versions do not. In other words, using 64 bits for operations that could be handled at 32 is an unnecessary waste of resources (memory, cache, etc.). However, in applications such as scientific computing, the data being processed often comes naturally in 64-bit blocks and will therefore be handled faster on a 64-bit architecture, because the CPU is designed to work directly with these sizes rather than forcing programs to perform multiple steps to accomplish the same thing.

These assessments are complicated by the fact that, when defining new architectures, instruction set designers have taken the opportunity to make changes that fill gaps in the old ones, adding new features designed to improve performance (such as, for example, the additional registers in the AMD64 architecture).

HTML: HyperText Markup Language



HyperText Markup Language (HTML) is, in computer science, the markup language usually used for the formatting and layout of the hypertext documents available on the World Wide Web in the form of web pages.

It is a public-domain language whose syntax is defined by the World Wide Web Consortium (W3C), and it derives from another language with more general purposes, SGML.

General features

HTML is a markup language that describes the layout, or graphical presentation, of the content of a web page, textual and otherwise, through formatting tags. Though HTML supports the inclusion of scripts and external objects such as images or movies, it is not a programming language: providing no definitions of variables, data structures, functions, or control structures with which to implement programs, its code can only structure and decorate textual data.


HTML, or its variant XHTML, aims to manage content by associating with it, or specifying, the graphical structure (layout) of the web page to be realized, using different tags. Each tag (e.g. <h1> or <p>) assigns a different role to the content it marks (so <h1> denotes greater importance than the <p> tag). Formatting the text consists of inserting markers, or labels, called tags, which describe characteristics such as function, color, size, and relative position within the page. Browsers that read the code show the user a default format for each tag they encounter (so, for example, content marked with <h1> might be rendered in an 18pt font and content marked with <p> in a 12pt font). However, this formatting is completely under the user's control and can be changed in the browser settings.
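As a minimal illustration of the tags just mentioned, the following Python snippet (the file name is illustrative) writes out a tiny page using <h1> and <p>; opening hello.html in a browser shows the default formats described above:

# Build a minimal HTML page and save it to a file.
page = """<!DOCTYPE html>
<html>
  <head><title>Hello, HTML</title></head>
  <body>
    <h1>A top-level heading</h1>   <!-- rendered large by default -->
    <p>A paragraph of ordinary text.</p>
  </body>
</html>
"""

with open("hello.html", "w", encoding="utf-8") as f:
    f.write(page)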

When a hypertext document written in HTML is stored in a file, its extension is typically .html or .htm.