In this talk, I will present a novel search-and-learning framework for unsupervised text generation. We define a heuristic scoring function that roughly estimates the quality of a candidate sentence for a given task, and perform stochastic local search (such as simulated annealing) to generate an output sentence. We then train a sequence-to-sequence model on the search results, which improves inference efficiency and smooths out search noise. The search and learning processes can be alternated to further boost performance. Our search-and-learning framework achieves strong unsupervised performance on several natural language generation tasks; in particular, our model outperforms a few supervised methods on paraphrase generation.
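To make the search step concrete, here is a minimal sketch of simulated-annealing local search over a sentence, assuming a generic scoring function and edit-proposal operator. The toy score, the synonym table, and all names below are illustrative placeholders, not the talk's actual model or scoring function:

```python
import math
import random

def simulated_annealing(sentence, score, propose, T0=1.0, cooling=0.95,
                        steps=100, seed=0):
    """Stochastic local search: propose a local edit, accept it with
    probability min(1, exp((score(new) - score(cur)) / T)), then cool T."""
    rng = random.Random(seed)
    cur, cur_score = sentence, score(sentence)
    T = T0
    for _ in range(steps):
        cand = propose(cur, rng)
        cand_score = score(cand)
        # Always accept improvements; accept worse edits with a
        # temperature-dependent probability (the annealing step).
        if cand_score >= cur_score or \
                rng.random() < math.exp((cand_score - cur_score) / T):
            cur, cur_score = cand, cand_score
        T *= cooling
    return cur, cur_score

# Toy example: the score rewards covering two target keywords, and the
# proposal swaps a word for a (hypothetical) synonym.
SYNONYMS = {"good": ["great", "fine"], "great": ["good", "fine"],
            "fine": ["good", "great"], "movie": ["film"],
            "film": ["movie"], "very": ["really"], "really": ["very"]}

def toy_score(words):
    target = {"great", "film"}
    return len(target & set(words))

def toy_propose(words, rng):
    i = rng.randrange(len(words))
    new = list(words)
    options = SYNONYMS.get(words[i])
    if options:
        new[i] = rng.choice(options)
    return new

best, best_score = simulated_annealing("a very good movie".split(),
                                       toy_score, toy_propose)
```

In practice the scoring function combines task-specific terms (e.g., semantic similarity and fluency), and the sentences found by this search become training targets for the sequence-to-sequence model.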