Without the power of algorithms, it is impossible to become a master.

Algorithms are one of the most important cornerstones of computer science, yet they are neglected by some programmers in China. Many students see the wide variety of programming languages that companies list when recruiting and come away with a misconception: they believe that studying computers means learning one programming language after another, or that picking up the newest languages, technologies, and standards is the best way to pave their careers. In fact, they have been misled by these companies. Programming languages are worth learning, but studying algorithms and computer science theory matters more, because languages and development platforms change with each passing day, while the fundamentals underneath them do not: data structures and algorithms, compiler principles, computer architecture, relational database principles, and so on. On the "Kaifu Student Network", one student vividly compared these foundational courses to "internal strength" and the new languages, technologies, and standards to "external moves". Those who chase fashions all day long end up knowing only the moves; without internal strength, it is impossible to become a master.

Algorithms and Me

When I moved into the Department of Computer Science in 1980, not many people had computer science as their field of study. People from other departments laughed at us and said, "Do you know why only your department has to add the word 'science'? There is no 'Department of Physical Science' or 'Department of Chemical Science', because those fields really are science and need no embellishment. You feel insecure, afraid of not being 'scientific' enough, so you try to cover it up." In fact, they were completely wrong. People who truly understand computers (not mere "programming craftsmen") have considerable grounding in mathematics: they can verify things with the rigorous thinking of a scientist and solve problems with the pragmatic methods of an engineer. The best embodiment of this kind of thinking and these methods is the algorithm.

I remember that the Othello program I wrote during my PhD won the world championship. At the time, the runner-up thought I had beaten him only by luck. Unconvinced, he asked how many moves per second my program could search, and only when he found that my software searched more than 60 times faster than his did he concede defeat completely. Why could I do 60 times more work on the same machine? Because I used a new algorithm that converts an exponential function into four approximation tables, so an approximate answer can be obtained in constant time. In this case, choosing the right algorithm was the key to winning the world championship.
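The essay does not give the details of that algorithm. The following is only a minimal sketch of the general idea of trading an exponential evaluation for a constant number of table lookups, assuming exp(x) over [0, 1) in fixed point; the table sizes, the decomposition into four tables, and the function being approximated are assumptions for illustration, not the original program.

```python
import math

# Sketch: write x as a sum of four fixed-point "digits", x = a + b + c + d,
# so exp(x) = exp(a) * exp(b) * exp(c) * exp(d), and each factor is read
# from a small precomputed table. All sizes here are illustrative.

BITS = 4                        # bits per digit
SCALE = 1 << (4 * BITS)         # fixed-point scale for x in [0, 1)

# One table per digit position: TABLES[k][d] = exp(value of digit d at position k)
TABLES = [
    [math.exp(d * (1 << (BITS * (3 - k))) / SCALE) for d in range(1 << BITS)]
    for k in range(4)
]

def exp_approx(x: float) -> float:
    """Approximate exp(x) for x in [0, 1) with four table lookups."""
    fixed = int(x * SCALE) & (SCALE - 1)
    result = 1.0
    for k in range(4):
        digit = (fixed >> (BITS * (3 - k))) & ((1 << BITS) - 1)
        result *= TABLES[k][digit]
    return result

if __name__ == "__main__":
    for x in (0.0, 0.25, 0.5, 0.9):
        print(x, exp_approx(x), math.exp(x))
```

The point of the sketch is only that the per-query cost is constant: four index computations and four multiplications, regardless of how expensive the original function is to evaluate directly.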

I remember that in 1988 a vice president of Bell Labs personally visited my school to understand why their speech recognition system ran several times slower than the one I had developed, and why, after scaling up to a large-vocabulary system, the speed gap grew to hundreds of times. Although they had bought several supercomputers and could barely keep the system running, such expensive computing resources dismayed their product division, because an "expensive" technology has no prospect of application. In the course of our discussion, I was surprised to find that they had implemented an O(n*m) dynamic programming computation as O(n*n*m). Even more surprising, they had published quite a few papers about it, had even given their algorithm a very special name, and had nominated it at a scientific conference in the hope of winning a major award. The researchers at Bell Labs were of course extremely capable, but they had all studied mathematics, physics, or electrical engineering and had never studied computer science or algorithms, which is how they came to make such a basic mistake. I imagine those people will never again laugh at the ones who studied computer science!
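For readers less familiar with these complexity classes, here is a minimal sketch of an O(n*m) dynamic program of the kind used in speech recognition alignment, in the spirit of dynamic time warping; the cost function and recurrence are assumptions for illustration, not the Bell Labs system.

```python
# A minimal O(n*m) dynamic-programming alignment of two sequences.
# The local cost (absolute difference) and the three-way recurrence are
# illustrative assumptions.

def dtw_cost(a, b):
    """Align sequences a (length n) and b (length m) in O(n*m) time."""
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = minimal cost of aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = abs(a[i - 1] - b[j - 1])          # local distance
            dp[i][j] = step + min(dp[i - 1][j],      # skip in a
                                  dp[i][j - 1],      # skip in b
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]

# The basic mistake described above amounts to recomputing each prefix from
# scratch instead of reusing the dp table, which multiplies the work by
# another factor of n and turns O(n*m) into O(n*n*m).

if __name__ == "__main__":
    print(dtw_cost([1.0, 2.0, 3.0, 4.0], [1.0, 3.0, 4.0]))
```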

Algorithms in the Internet Age

Some people may ask, "Computers today are so fast, do algorithms still matter?" In fact, there will never be a computer that is too fast, because we will always come up with new applications. Under the influence of Moore's Law, the computing power of computers grows rapidly every year while prices keep falling, but let us not forget that the amount of information to be processed is growing exponentially. Everyone now creates a great deal of data every day (photos, videos, voice, text, and so on), and increasingly capable recording and storage devices mean that the volume of information about each of us is exploding. The traffic and log volumes of the Internet are also growing rapidly. In scientific research, advances in experimental methods have pushed data volumes to unprecedented levels. Whether it is 3D graphics, massive data processing, machine learning, or speech recognition, all of it demands enormous amounts of computation. In the Internet age, more and more challenges can only be met by superior algorithms.

Here is another example from the Internet age. When you search on the web or on a mobile phone for a nearby coffee shop, how should the search engine handle the request? The simplest approach is to find every cafe in the entire city, compute the distance between each one and your location, sort the results, and return the nearest ones. But how is that distance computed? Graph theory offers many classic algorithms for exactly this problem.
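As an illustration of this naive approach, here is a minimal sketch that computes road-network distances with Dijkstra's algorithm (one classic graph algorithm for the job) and then sorts the cafes; the graph, node names, and edge weights are invented for this example.

```python
import heapq

# Naive scheme from the text: one shortest-path computation from the user's
# location over the road network, then sort every cafe by its distance.

def dijkstra(graph, source):
    """Single-source shortest-path distances over a weighted adjacency dict."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

if __name__ == "__main__":
    roads = {
        "me":     [("corner", 2.0), ("plaza", 5.0)],
        "corner": [("cafe_a", 1.0), ("plaza", 2.5)],
        "plaza":  [("cafe_b", 1.5)],
    }
    cafes = ["cafe_a", "cafe_b"]
    dist = dijkstra(roads, "me")
    nearest = sorted(cafes, key=lambda c: dist.get(c, float("inf")))
    print(nearest, [dist.get(c) for c in nearest])
```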

This may be the most intuitive approach, but it is definitely not the fastest. If a city has only a handful of cafes, it works fine and the amount of computation is small. But if the city has many cafes and many users are issuing similar searches, the load on the servers becomes heavy. In that case, how do we optimize the algorithm?

First, we can "preprocess" the cafes of the entire city. For example, divide the city into a number of "grid" cells, then, based on the user's location, place the user in a cell and sort by distance only the cafes within that cell, as sketched below.
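Below is a minimal sketch of this grid preprocessing, assuming planar coordinates, a 1 km cell size, and straight-line distance; all names and numbers are illustrative.

```python
from collections import defaultdict
import math

# Preprocessing: bucket every cafe into a fixed-size grid cell once.
# Query: only the user's cell (and its immediate neighbours) is examined.

CELL = 1.0  # grid cell size in kilometres (assumed)

def build_grid(cafes):
    """cafes: list of (name, x, y). Returns dict: cell -> list of cafes."""
    grid = defaultdict(list)
    for name, x, y in cafes:
        grid[(int(x // CELL), int(y // CELL))].append((name, x, y))
    return grid

def nearby(grid, x, y):
    """Sort only the cafes in the 3x3 block of cells around the user."""
    cx, cy = int(x // CELL), int(y // CELL)
    candidates = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            candidates.extend(grid.get((cx + dx, cy + dy), []))
    return sorted(candidates,
                  key=lambda c: math.hypot(c[1] - x, c[2] - y))

if __name__ == "__main__":
    grid = build_grid([("A", 0.2, 0.3), ("B", 0.9, 0.8), ("C", 5.0, 5.0)])
    print(nearby(grid, 0.5, 0.5))   # only A and B are even considered
```

The design point is that the expensive work (bucketing every cafe) is done once, while each query touches only a few cells instead of the whole city.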
