Friday, 28 March 2014

Quantum Computing

The future of computing, invented by D-Wave Systems


The Quantum Computer


  • Exploits quantum mechanical effects
  • Built around “qubits” rather than “bits”
  • Operates in an extreme environment
  • Enables quantum algorithms to solve very hard problems                 

    Software Architecture Overview

    Starting from the bottom, here are short descriptions of each layer of the architecture:
    Quantum Machine Instruction (QMI): This is the basic building block upon which all the software is built. A single QMI is executed by the quantum processor in response to the user’s problem submission. A user can directly program the system at this level or use one of the supported higher-level languages or tools.
    Quantum Meta-Machine* (*under development): The Quantum Meta-Machine is an abstraction layer that makes user code independent of the specific topology of a particular quantum processor. This makes it easier for code written above this layer to be used on our system.
    Interfaces to Higher Level Languages: For those wishing to develop applications, this layer of software abstraction is usually the most natural starting point. This layer provides the ability to use standard high-level programming languages to access the underlying parts of the software system. It allows programs written in C, C++, Fortran and Python to create a Quantum Machine Instruction that is executed on the processor.
    Hybrid Mathematical Interpreter: This allows a user to specify a problem as a series of algebraic expressions using MATLAB or other Mathematica®-like tools. Expressions are then converted into a Quantum Machine Instruction and executed on the processor.

    Software Tools

    QSage quantum accelerator: A D-Wave system works in concert with a conventional computer, acting as a co-processor or accelerator. This split allows hybrid systems to be built that can deal with enormous amounts of data and extremely complex generating functions.
    This hybrid programming model separates the evaluation of the generating function from the process of generating potential solutions. The evaluation of the function happens in a conventional computing system, as it involves a large amount of computation of the sort conventional computing systems excel at. The solution generation happens in the D-Wave system, using the results obtained by the conventional computing system to quickly home in on better and better solutions.
    Conventional software can also perform these types of iterative generating-function computations (for example, using the Tabu algorithm), but the D-Wave system takes advantage of its special property of returning many potential solutions at once, allowing more effective jumps to better locations in the solution space. This reduces the total cost (time) of getting to the solution.
    Information flow between the two is low-bandwidth and is restricted to bit strings representing potential solutions flowing from the D-Wave system to the conventional system, and real numbers representing the values of the generating function evaluated for those guesses flowing from the conventional system to the D-Wave system.
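To make the hybrid loop above concrete, here is a minimal sketch in Python. Everything in it is an assumption for illustration, not D-Wave's actual API: `sample_candidates` stands in for the quantum co-processor (mocked here with random bit flips), and the generating function is a toy cost evaluated on the "conventional" side.

```python
import random

def generating_function(bits):
    # Toy "expensive" classical cost: count adjacent equal bits (lower is better).
    return sum(1 for a, b in zip(bits, bits[1:]) if a == b)

def sample_candidates(best, n=20):
    # Stand-in for the quantum co-processor: return many candidate bit
    # strings at once, here just 2-bit perturbations of the incumbent.
    out = []
    for _ in range(n):
        cand = list(best)
        for i in random.sample(range(len(cand)), 2):
            cand[i] ^= 1
        out.append(cand)
    return out

def hybrid_solve(n_bits=16, iterations=50, seed=0):
    random.seed(seed)
    best = [random.randint(0, 1) for _ in range(n_bits)]
    best_cost = generating_function(best)
    for _ in range(iterations):
        # Low-bandwidth exchange: bit strings one way, real-valued costs the other.
        for cand in sample_candidates(best):
            cost = generating_function(cand)
            if cost < best_cost:
                best, best_cost = cand, cost
    return best, best_cost
```

Note how the only traffic between the two halves is candidate bit strings in one direction and their evaluated costs in the other, matching the low-bandwidth split described above.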
    Quantum Computation
    Rather than store information as 0s or 1s as conventional computers do, a quantum computer uses qubits – which can be a 1 or a 0 or both at the same time. This “quantum superposition”, along with the quantum effects of entanglement and quantum tunnelling, enable quantum computers to consider and manipulate all combinations of bits simultaneously, making quantum computation powerful and fast. 
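A small classical sketch shows why "all combinations at once" is such a strong claim: describing n qubits takes 2**n amplitudes, so even simulating a modest register blows up exponentially. This toy code (illustrative only) builds the uniform superposition state vector explicitly.

```python
import math

def uniform_superposition(n):
    # One amplitude per basis state |00..0> .. |11..1>; a register of n
    # qubits in equal superposition assigns 1/sqrt(2**n) to each.
    amp = 1 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

state = uniform_superposition(3)  # 3 qubits -> 8 amplitudes
```

At n = 50 this list would already need about 10**15 entries, which is why classical simulation of quantum superposition quickly becomes infeasible.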

    How D-Wave Systems Work

    Quantum computing uses an entirely different approach than classical computing. A useful analogy is to think of a landscape with mountains and valleys.
    Solving optimization problems can be thought of as trying to find the lowest point on this landscape. Every possible solution is mapped to coordinates on the landscape, and the altitude of the landscape is the “energy’” or “cost” of the solution at that point. The aim is to find the lowest point on the map and read the coordinates, as this gives the lowest energy, or optimal solution to the problem.
    Classical computers running classical algorithms can only "walk over this landscape". Quantum computers can tunnel through the landscape making it faster to find the lowest point. The D-Wave processor considers all the possibilities simultaneously to determine the lowest energy required to form those relationships. The computer returns many very good answers in a short amount of time - 10,000 answers in one second. This gives the user not only the optimal solution or a single answer, but also other alternatives to choose from.
    D-Wave systems use "quantum annealing"  to solve problems.  Quantum annealing “tunes” qubits from their superposition state to a classical state to return the set of answers scored to show the best solution.
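The landscape analogy can be sketched with classical simulated annealing, the thermal cousin of quantum annealing (this is not what the D-Wave hardware does internally; it is only an illustration of searching a landscape, with a made-up 1-D energy function).

```python
import math
import random

def energy(x):
    # Toy 1-D landscape: a broad bowl with many sinusoidal valleys.
    return 0.1 * x * x + 3 * math.sin(x)

def anneal(x=25.0, steps=5000, t0=10.0, seed=1):
    random.seed(seed)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        x_new = x + random.uniform(-1, 1)
        d_e = energy(x_new) - energy(x)
        # Always accept downhill moves; accept uphill ones with a
        # Boltzmann probability that shrinks as the temperature drops.
        if d_e < 0 or random.random() < math.exp(-d_e / t):
            x = x_new
    return x
```

Where a thermal walker must climb over the hills between valleys, quantum annealing can tunnel through them, which is the advantage the text above describes.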

    Programming 

    To program the system a user maps their problem into this search for the lowest point. A user interfaces with the quantum computer by connecting to it over a network, as you would with a traditional computer. The user’s problems are sent to a server interface, which turns the optimization program into machine code to be programmed onto the chip. The system then executes a "quantum machine instruction" and the results are returned to the user.
    D-Wave systems are designed to be used in conjunction with classical computers, as a "quantum co-processor".
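A quantum machine instruction of this kind boils down to a QUBO: minimise the energy of a binary vector under a matrix of weights. The sketch below (hypothetical; real submissions go through D-Wave's server interface) builds a tiny QUBO and solves it by brute force in place of the chip.

```python
import itertools

def qubo_energy(Q, x):
    # E(x) = sum over (i, j) of Q[i, j] * x[i] * x[j] for binary x.
    return sum(v * x[i] * x[j] for (i, j), v in Q.items())

def solve_qubo(Q, n):
    # Brute force stands in for the chip; the real system anneals instead.
    best = min(itertools.product([0, 1], repeat=n),
               key=lambda x: qubo_energy(Q, x))
    return list(best), qubo_energy(Q, best)

# Example: the penalty (x0 + x1 + x2 - 1)^2, minimised when exactly one
# variable is on, expands (dropping the constant) to this weight matrix:
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1,
     (0, 1): 2, (0, 2): 2, (1, 2): 2}
```

Mapping a user's constraints into penalty terms like this is the "search for the lowest point" that programming the system amounts to.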

    Capabilities

    D-Wave’s flagship product, the 512-qubit D-Wave Two quantum computer, is the most advanced quantum computer in the world. It is based on a novel type of superconducting processor that uses quantum mechanics to massively accelerate computation. It is best suited to tackling complex optimization problems that exist across many domains such as:
    • Optimization
    • Machine Learning
    • Pattern Recognition and Anomaly Detection
    • Financial Analysis
    • Software/Hardware Verification and Validation

Applications



Optimization


Imagine you are building a house, and have a list of things you want to have in your house, but you can’t afford everything on your list because you are constrained by a budget. What you really want to work out is the combination of items which gives you the best value for your money.
This is an example of an optimization problem, where you are trying to find the best combination of things given some constraints. Typically, these are very hard problems to solve because of the huge number of possible combinations. With just 270 on/off switches, there are more possible combinations than atoms in the universe!
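The house-building example is a 0/1 knapsack problem, and a short sketch makes the combinatorial blow-up tangible. The items, prices and values below are made up for illustration.

```python
import itertools

# Made-up items: (name, cost, value-to-you).
items = [("solar panels", 9000, 8), ("hot tub", 6000, 4),
         ("home office", 4000, 7), ("garage", 12000, 6),
         ("garden", 3000, 5)]
budget = 15000

def best_combination(items, budget):
    # Exhaustive search over all 2**n on/off combinations. At 270 items
    # this loop is hopeless, which is exactly the point made above.
    best, best_value = [], 0
    for choice in itertools.product([0, 1], repeat=len(items)):
        cost = sum(c * it[1] for c, it in zip(choice, items))
        value = sum(c * it[2] for c, it in zip(choice, items))
        if cost <= budget and value > best_value:
            best = [it[0] for c, it in zip(choice, items) if c]
            best_value = value
    return best, best_value
```

With five items the loop checks only 32 combinations; with 270 it would need about 10**81, more combinations than atoms in the universe, so exhaustive search stops being an option almost immediately.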
These types of optimization problems exist in many different domains - systems design, mission planning, airline scheduling, financial analysis, web search, cancer radiotherapy and many more. They are some of the most complex problems in the world, with potentially enormous benefits to businesses, people and science if optimal solutions can be readily computed.

Optimization problems are some of the most complex problems to solve.


Water Network Optimization

This is an example of using a quantum computer with a conventional or HPC system. EPANET is numerical software that simulates water movement and water quality within pressurized pipe networks. It can model the flow of water in each pipe, the pressure at each node, the height of the water in each tank, the concentration of a chemical species throughout the network during a simulation period, water age, and source tracing. EPANET can compute the properties of a water network given discrete choices for the design of the network.
The quantum computer gives us a tool for designing the optimal network by penalizing undesirable outcomes, such as low pressure or the presence of chemical contaminants, while rewarding desirable outcomes such as low cost, low risk and safety. This quantum-classical hybrid solution quickly homes in on good solutions by asking the conventional system to evaluate far fewer possibilities.

Radiotherapy Optimization

There are many examples of problems where a quantum computer can complement an HPC (high-performance computing) system. While the quantum computer is well suited to discrete optimization, the HPC system is much better at large-scale numerical simulation. Problems like optimizing cancer radiotherapy, where a patient is treated with several radiation beams that intersect at the tumor, illustrate how the two systems can work together.
The goal when devising a radiation plan is to minimize the collateral damage to the surrounding tissue and body parts – a very complicated optimization problem with thousands of variables. To arrive at the optimal radiation plan requires many simulations until an optimal solution is determined. With a quantum computer, the horizon of possibilities that can be considered between each simulation is much broader. But HPC is still the more powerful computation tool for running simulations. Using the quantum computer with an HPC system allows faster convergence on optimal design than is attainable by using HPC alone. 

Protein Folding

Simulating the folding of proteins could lead to a radical transformation of our understanding of complex biological systems and our ability to design powerful new drugs.
This application looks into how to use the quantum computer to explore the possible folding configurations of these interesting molecules. With an astronomical number of possible structural arrangements, protein folding is an enormously complex computational problem. Scientific research indicates that nature optimizes amino acid sequences to create the most stable protein, which correlates well with the search for the lowest-energy solutions.
With researchers at Harvard, we designed a system for predicting the folding patterns for lattice protein folding models and successfully ran small protein folding problems in hardware.

Machine Learning


When you look at a photograph it is very easy for you to pick out the different objects in the image: trees, mountains, velociraptors, etc. This task is almost effortless for humans, but it is in fact hugely difficult for computers to achieve. This is because programmers don't know how to define the essence of a 'tree' in computer code.
Machine learning is the most successful approach to solving this problem, by which programmers write algorithms that automatically learn to recognize the ‘essences’ of objects by detecting recurring patterns in huge amounts of data. Because of the amount of data involved in this process, and the immense number of potential combinations of data elements, this is a very computationally expensive optimization problem. As with other optimization problems, these can be mapped to the native ability of the D-Wave processor.

Machines learn to recognize objects by detecting recurring patterns.


Object Detection

Quantum hardware, trained using a binary classification algorithm, is able to detect whether or not an image contains a car.
Together with researchers at Google, we built software for determining whether or not there is a car in an image using a binary classification algorithm run in hardware. In excess of 500,000 discrete optimization problems were solved during the learning phase, with Google developers accessing the D-Wave system remotely. The car detector we developed remains competitive in quality to any car detector ever built.
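The core of such a binary classifier can be sketched as selecting a subset of weak classifiers whose combined vote best matches the training labels; that discrete subset selection is the optimization handed to the quantum processor. Everything below is a toy illustration (made-up features, labels and threshold rules), not the actual Google/D-Wave detector.

```python
import itertools

# Tiny made-up training set: (feature vector, label), label +1 = "car".
data = [((3, 1), 1), ((4, 2), 1), ((1, 4), -1), ((0, 3), -1)]

# Hand-written weak classifiers: simple threshold rules on the features.
weak = [lambda x: 1 if x[0] > 2 else -1,
        lambda x: 1 if x[1] < 3 else -1,
        lambda x: 1 if x[0] > 3 else -1]

def loss(subset, lam=0.1):
    # Squared disagreement between the subset's combined vote and the
    # labels, plus a sparsity penalty on the number of classifiers used.
    err = sum((sum(weak[i](x) for i in subset) - y) ** 2 for x, y in data)
    return err + lam * len(subset)

def best_subset():
    # Choosing the subset is the discrete optimization the quantum
    # processor would solve; here we brute-force all 2**3 subsets.
    subsets = [list(s) for r in range(len(weak) + 1)
               for s in itertools.combinations(range(len(weak)), r)]
    return min(subsets, key=loss)
```

In the real project this selection problem, over far more weak classifiers, was posed hundreds of thousands of times during the learning phase.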

Labeling news stories

We built software for automatically applying category labels to news stories and images. We found that our approach provided better labeling accuracy than state-of-the-art conventional approaches.
The labeling of news stories can be difficult for computers: they can see the keywords but don't understand the meaning of the words when combined. For labeling news stories, the corpus we used for training and testing was the REUTERS corpus, a well-known data set for testing multiple-label assignment algorithms. We found that our approach worked extremely well on this problem, providing 15.4% better labeling accuracy than a state-of-the-art conventional approach.
We took a similar approach to labeling images, using the SCENE corpus, another well-known data set for testing multiple-label assignment algorithms, for training and testing. Our approach again worked extremely well, providing 14.4% better labeling accuracy than a state-of-the-art conventional approach.

Video Compression

Using unsupervised machine learning approaches, one can automate the discovery of a very sparse way to represent objects. This technique can be used for incredibly efficient compression.
The algorithm works by finding a concise representation of the objects being fed into the computer. The techniques involved are closely related to those in compressive sensing. As a test of the unsupervised feature learning algorithm, we discovered an extremely sparse representation of the ‘Frey faces’ data set, achieving a compression factor of approximately 50x.

Monte Carlo Simulation


Many things in the world are uncertain, and governed by the rules of probability. We have, in our heads, a model of how things will turn out in the future, and the better our model is, the better we are at predicting the future. We can also build computer models to try and capture the statistics of reality. These tend to be very complicated, involving a huge number of variables.
In order to check whether a computer's statistical model represents reality, we need to be able to draw samples from it and check that the statistics of our model match the statistics of real-world data. Monte Carlo simulation, which relies on repeated random sampling to approximate the probability of certain outcomes, is an approach used in many industries such as finance, energy, manufacturing, engineering, oil & gas and the environment. For a complex model with many different variables, this is a difficult task to do quickly.
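The sample-and-tally structure of Monte Carlo simulation fits in a few lines. This toy example estimates the probability that two dice sum above a threshold and compares it with the exact value; real models have thousands of variables, but the shape of the computation is the same.

```python
import random

def monte_carlo(threshold=9, trials=100_000, seed=42):
    # Repeated random sampling: roll two dice, count how often the sum
    # exceeds the threshold, and return the observed frequency.
    random.seed(seed)
    hits = sum(1 for _ in range(trials)
               if random.randint(1, 6) + random.randint(1, 6) > threshold)
    return hits / trials

# Exact answer for comparison: sums of 10, 11 or 12 -> (3 + 2 + 1) / 36.
exact = 6 / 36
```

For complicated, high-dimensional models there is no closed-form `exact` to compare against, which is why drawing good samples quickly matters so much.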
For more information about D-Wave, please visit their website.



Wednesday, 26 March 2014

10 Blacklisted Hacking Countries


1. China
The Chinese may not always be guilty, but they account for 41% of hacker attacks worldwide. Just one year earlier, China was responsible for only 13% of cyber attacks according to Akamai, and its share in the third quarter was 33%.



2. U.S.
Every tenth hacker attack worldwide originates in the United States.



3. Turkey
Bronze medal for Turkey, which accounts for 4.7% of global cyber crime.



4. Russia
Russia's share declined from 6.8% to 4.3% in October-December 2012.



5. Taiwan
The Taiwanese were responsible for 3.7% of computer crimes at the end of 2012.



6. Brazil
Brazil registered a decline in hacking attacks, from 4.4% at the end of 2011 to 3.8% in the third quarter of 2012 and 3.3% in the fourth.



7. Romania
Seventh is Romania, with a share of 2.8%.



8. India
India is responsible for 2.4% of hacking attacks worldwide.



9. Italy
Italy's share fell to 1.6%.



10. Hungary
Hungary is responsible for 1.4% of cyber attacks in late 2012.