There have been many studies on mining frequent itemsets (or patterns) in the data mining field because of their broad applications to mining association rules, correlations, graph patterns, constraint-based frequent patterns, sequential patterns, and many other data mining tasks. One of the major challenges in frequent pattern mining is the huge number of result patterns. As the minimum support threshold is lowered, an exponentially large number of itemsets is generated. Therefore, effectively pruning unimportant patterns during the mining process is one of the main topics in frequent pattern mining. In weighted frequent pattern mining, not only the support but also the weight of items is used, so that important patterns can be detected. In this paper, we propose two efficient algorithms for mining weighted frequent itemsets, whose main approaches are to push weight constraints into the Apriori algorithm and the pattern-growth algorithm, respectively. Additionally, we show how to maintain the downward closure property when mining weighted frequent itemsets. In our approach, normalized weights within a weight range are assigned to items according to their importance. The weight range restricts the weights of items, and a minimum weight is used to balance the weight and support of items when pruning the search space. Our approach generates fewer but important weighted frequent itemsets in large databases, particularly dense databases with low minimum supports. An extensive performance study shows that our algorithms outperform previous mining algorithms. In addition, they are efficient and scalable.
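To illustrate the kind of pruning the abstract refers to, the following is a minimal Python sketch of an Apriori-style weighted mining loop. It is not the paper's algorithm: the itemset weight (average of normalized item weights), the weighted-support definition, and the use of the maximum item weight as an over-estimate are illustrative assumptions. The point of the sketch is the bound support(X) * w_max, which is anti-monotone even though the exact weighted support is not, so candidates can be pruned level by level without losing any weighted frequent itemset.

```python
from itertools import combinations

def weighted_apriori(transactions, weights, min_wsup, max_len=4):
    """Sketch of weighted frequent itemset mining with downward-closure pruning.

    transactions : list of frozensets of items
    weights      : dict item -> normalized weight (assumed to lie in a weight range)
    min_wsup     : minimum weighted support threshold (a fraction)
    """
    n = len(transactions)
    w_max = max(weights.values())  # global over-estimate of any itemset weight

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    def weight(itemset):
        # Illustrative choice: average of the item weights in the itemset.
        return sum(weights[i] for i in itemset) / len(itemset)

    # Level 1: keep items whose over-estimated weighted support passes the bound.
    current = [frozenset([i]) for i in weights
               if support(frozenset([i])) * w_max >= min_wsup]
    result, k = {}, 1
    while current:
        # Report itemsets whose actual weighted support meets the threshold.
        for X in current:
            ws = support(X) * weight(X)
            if ws >= min_wsup:
                result[X] = ws
        if k >= max_len:
            break
        # Apriori candidate generation: every k-subset must have survived pruning.
        items = sorted({i for X in current for i in X})
        level = set(current)
        candidates = {frozenset(c) for c in combinations(items, k + 1)
                      if all(frozenset(s) in level for s in combinations(c, k))}
        # Prune with the anti-monotone bound support(X) * w_max < min_wsup.
        current = [X for X in candidates if support(X) * w_max >= min_wsup]
        k += 1
    return result
```

Because support is anti-monotone and w_max is a constant, any superset of a pruned candidate is also pruned, which is the downward closure property the abstract claims to preserve; the actual paper balances this bound with a minimum weight threshold rather than the single global maximum used here.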