MIT News (06/20/16) Larry Hardesty
Massachusetts Institute of Technology (MIT) researchers have developed Swarm, a chip design that should make parallel programs more efficient and easier to write. The researchers used simulations to compare Swarm versions of six common algorithms with the best existing parallel versions, which had been individually engineered by software developers. The Swarm versions were between three and 18 times as fast, yet they generally required only one-tenth as much code. The researchers focused on a specific set of applications that have resisted parallelization for many years; many of them involve the analysis of graphs, which consist of nodes and edges. Frequently, the edges have associated numbers called "weights," which often represent the strength of correlations between data points in a dataset.

Swarm is equipped with extra circuitry specifically designed to handle that kind of prioritization: it time-stamps tasks according to their priorities and begins working on the highest-priority tasks in parallel. Higher-priority tasks may spawn their own lower-priority tasks, and Swarm automatically slots those into its queue of tasks. Swarm also has a circuit that records the memory addresses of all the data its cores are currently working on; the circuit implements a Bloom filter, which stores data in a fixed allotment of space and answers yes-or-no questions about its contents.
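To make the task model concrete, here is a minimal, sequential software sketch of priority-ordered task execution, using Dijkstra's shortest-path algorithm on a weighted graph as a representative workload. The task queue, the use of the tentative path length as a timestamp, and the spawning of lower-priority child tasks are illustration choices meant to mirror the description above; the sketch models only the ordering, not Swarm's hardware or its speculative parallelism.

```python
import heapq

def shortest_paths(graph, source):
    """graph: {node: [(neighbor, weight), ...]}; returns distances from source."""
    dist = {source: 0}
    # The task queue, ordered by timestamp (here, the tentative distance).
    tasks = [(0, source)]
    while tasks:
        timestamp, node = heapq.heappop(tasks)   # run the highest-priority task
        if timestamp > dist.get(node, float("inf")):
            continue                             # a better path was already found
        for neighbor, weight in graph[node]:
            candidate = timestamp + weight
            if candidate < dist.get(neighbor, float("inf")):
                dist[neighbor] = candidate
                # The finished task spawns a lower-priority child task,
                # which is simply slotted into the queue.
                heapq.heappush(tasks, (candidate, neighbor))
    return dist

if __name__ == "__main__":
    g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}
    print(shortest_paths(g, "a"))   # {'a': 0, 'b': 2, 'c': 3}
```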
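The conflict-tracking circuit is easiest to picture as a Bloom filter over memory addresses. The sketch below shows the general data structure in software, assuming an arbitrary bit-array size and a simple double-hashing scheme; these parameters are illustrative and are not Swarm's actual design. The key property is that a "no" answer is always correct, while a "yes" answer may occasionally be a false positive.

```python
class BloomFilter:
    """Fixed-size set sketch that answers yes-or-no membership questions."""

    def __init__(self, num_bits=1024, num_hashes=3):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = 0   # a fixed allotment of space, packed into one integer

    def _positions(self, item):
        # Derive several bit positions from the item via double hashing.
        h1 = hash(item)
        h2 = hash((item, "salt"))
        return [(h1 + i * h2) % self.num_bits for i in range(self.num_hashes)]

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item):
        # True  -> "maybe present" (possible false positive)
        # False -> definitely not present
        return all(self.bits & (1 << pos) for pos in self._positions(item))

if __name__ == "__main__":
    touched = BloomFilter()
    touched.add(0x7FFE0010)                     # record an address in use
    print(touched.might_contain(0x7FFE0010))    # True
    print(touched.might_contain(0x12340000))    # almost certainly False
```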